Background: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task, and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control in which the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand as it approaches an object.

Methods: Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for them. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Under shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps.

Results: Both subjects were more successful on object transfer tasks when using shared control than with BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92 % of trials, demonstrating the potential for users to accurately execute their intention while using shared control.

Conclusions: Integrating BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users.
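The Methods section describes blending BMI-derived commands with autonomous grasping commands as the hand nears an object. The abstract does not specify the blending law, but a common approach to this kind of shared control is a proximity-weighted mix of the two velocity commands. The sketch below illustrates that idea; the function name, the linear weighting, and the `engage_radius` parameter are illustrative assumptions, not the authors' published method.

```python
import numpy as np

def blend_commands(user_cmd, auto_cmd, dist_to_object, engage_radius=0.15):
    """Illustrative shared-control blend (not the paper's exact scheme).

    user_cmd: BMI-decoded end-effector velocity (3-vector)
    auto_cmd: autonomous grasp-approach velocity (3-vector)
    dist_to_object: distance from hand to the detected object (meters)
    engage_radius: hypothetical range at which assistance begins

    The assistance weight alpha ramps linearly from 0 (at or beyond
    engage_radius, pure user control) to 1 (at the grasp position,
    pure autonomous control).
    """
    alpha = np.clip(1.0 - dist_to_object / engage_radius, 0.0, 1.0)
    return (1.0 - alpha) * np.asarray(user_cmd) + alpha * np.asarray(auto_cmd)
```

Far from any object the user's command passes through unchanged, so general reaching is unconstrained; only near a recognized object does the autonomous grasp command take over to secure the grasp, matching the generalizability goal stated in the Conclusions.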