The overall aim of this project is to devise a method of shared control for a robotic manipulator controlled using non-invasive brain signals.
Many current brain-computer interface (BCI) systems for robotic control rely on paradigms that are mentally taxing for the user, while failing to provide the full breadth of motion that a human arm affords.
With these two limitations in mind, the objectives of this project are to 1) explore different forms of imagery to enhance the functionality and ease of use of a BCI-controlled arm and 2) develop a method of shared control, in which decision making for the robot's actions can be handed over to an autonomous system when required.
To achieve these goals, ideas from robotics, artificial intelligence, neuroscience and psychology will be combined to create this BCI robotic system. The hope is that such systems will be used by people who have lost function in their upper limbs, improving their quality of life and independence.
Perception-action coupling / Active inference
Robotic control
Human Augmentation
Brain-computer interfaces
Biosignal data
Synthetic data
BSc Biomedical Science, University of Birmingham.
MSc Computational Neuroscience and Cognitive Robotics, University of Birmingham.
Dr Dingguo Zhang
Prof Damien Coyle
Prof Danae Stanton Fraser