Vision-based methods commonly use RGB cameras to capture the environment and the context of a gesture, but these cameras come with privacy risks. This project focuses on how non-visual signals, such as electroencephalography (EEG), electromyography (EMG) and inertial measurement unit (IMU) data, can be used to infer the early intent of a gesture, enabling seamless interaction with computers. The approach also has potential applications in providing real-time feedback to improve user performance in complex tasks such as CPR or physical therapy.
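As a purely illustrative sketch of the kind of pipeline such a project might explore (the channel counts, window length, features and classifier below are assumptions for illustration, not the project's actual method), the following Python example extracts simple time-domain features from the first 200 ms of synthetic EMG and IMU trials and trains a scikit-learn classifier to predict gesture class from that early window.

```python
# Illustrative sketch only: early gesture-intent classification from
# synthetic EMG + IMU data. Sensor layout, window size, features and the
# classifier are assumptions, not the project's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_trial(label, fs=200, duration_s=1.0):
    """Simulate one gesture trial: 4 EMG + 6 IMU channels sampled at fs Hz."""
    n = int(fs * duration_s)
    emg = rng.normal(0, 1.0 + 0.5 * label, size=(n, 4))  # amplitude differs by class
    imu = rng.normal(0.2 * label, 1.0, size=(n, 6))      # offset differs by class
    return np.hstack([emg, imu])

def early_window_features(trial, fs=200, window_s=0.2):
    """Keep only the first 200 ms (the 'early intent' window) and summarise
    each channel with RMS and mean features."""
    w = trial[: int(fs * window_s)]
    rms = np.sqrt(np.mean(w ** 2, axis=0))
    mean = np.mean(w, axis=0)
    return np.concatenate([rms, mean])

# Toy dataset: 3 gesture classes x 100 trials each.
X = np.array([early_window_features(make_trial(c))
              for c in range(3) for _ in range(100)])
y = np.repeat(np.arange(3), 100)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"Early-intent accuracy on synthetic data: {clf.score(X_test, y_test):.2f}")
```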
Socially acceptable wearable sensing
Signal processing and analysis
Machine learning
Multimodal sensing
Real-time assistance and feedback
MEng Integrated Mechanical and Electrical Engineering – University of Bath
Dr Adwait Sharma
Prof Eamonn O’Neill
Dr Leen Jabban