Thoughts make Robot Hand Pinch and Scoop
Using just her thoughts, a woman with quadriplegia was able to maneuver a robot hand into four positions and successfully pick up big and small boxes, a ball, an oddly shaped rock, and fat and skinny tubes.
The findings describe, for the first time, 10-degree-of-freedom (10D) brain control of a prosthetic device, in which the woman was able to maneuver the hand into four positions: fingers spread, scoop, pinch, and thumbs up.
Four of the gestures possible with the robotic hand, clockwise from top left: scoop, opposition, spread, and pinch. (Credit: Journal of Neural Engineering/IOP Publishing)
The achievement, described in the Journal of Neural Engineering, is a further demonstration of how brain-computer interface technology has the potential to improve the function and quality of life of those unable to use their own arms.
“Our project has shown that we can interpret signals from neurons with a simple computer algorithm to generate sophisticated, fluid movements that allow the user to interact with the environment,” says senior investigator Jennifer Collinger, assistant professor of physical medicine and rehabilitation (PM&R) at University of Pittsburgh School of Medicine.
In February 2012, small electrode grids with 96 tiny contact points each were surgically implanted in the regions of trial participant Jan Scheuermann’s brain that would normally control her right arm and hand movement.
Each electrode point picked up signals from an individual neuron, which were then relayed to a computer to identify the firing patterns associated with particular observed or imagined movements, such as raising or lowering the arm, or turning the wrist.
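The decoding step described above can be sketched in code. This is a toy illustration only, not the study's actual decoder: the weights, baseline rates, and the simple linear mapping are all assumptions standing in for the "simple computer algorithm" the researchers describe, with the 96-contact grid and 10 degrees of freedom taken from the article.

```python
import numpy as np

# Illustrative sketch (not the study's decoder): map per-neuron
# firing rates to a movement command with a linear readout.
rng = np.random.default_rng(0)

n_neurons = 96   # contact points on one implanted electrode grid
n_dims = 10      # 10 degrees of freedom: arm, wrist, and hand shape

# Hypothetical decoding weights and resting firing rates; in the
# real system these would come from calibration, not random draws.
weights = rng.normal(size=(n_dims, n_neurons))
baseline = rng.normal(size=n_neurons)

def decode(firing_rates: np.ndarray) -> np.ndarray:
    """Turn one sample of per-neuron firing rates into a 10D command."""
    return weights @ (firing_rates - baseline)

# One time step: observed rates yield one value per degree of freedom.
rates = baseline + rng.normal(size=n_neurons)
command = decode(rates)
print(command.shape)  # (10,)
```

Run at each time step, a readout like this would continuously drive the prosthetic arm and hand from the stream of recorded firing rates.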
That “mind-reading” was used to direct the movements of a prosthetic arm developed by Johns Hopkins Applied Physics Laboratory.
Within a week of the surgery, Scheuermann could reach in and out, left and right, and up and down with the arm to achieve 3D control. Before three months had passed, she also could flex the wrist back and forth, move it from side to side, and rotate it clockwise and counterclockwise, as well as grip objects, adding up to 7D control. Those findings were published in The Lancet.
“In the next part of the study, described in this new paper, Jan mastered 10D control, allowing her to move the robot hand into different positions while also controlling the arm and wrist,” says Michael Boninger, professor and chair of PM&R and director of the UPMC Rehabilitation Institute.
To bring the total of arm and hand movements to 10, the simple pincer grip was replaced by four hand shapes: finger abduction, in which the fingers are spread out; scoop, in which the last fingers curl in; thumb opposition, in which the thumb moves outward from the palm; and a pinch of the thumb, index, and middle fingers.
As before, Scheuermann watched animations of and imagined the movements while the team recorded the signals her brain was sending in a process called calibration. Then, they used what they had learned to read her thoughts so she could move the hand into the various positions.
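The calibration idea can be illustrated with simulated data: record firing rates while the intended movement is known (here, while animations are watched and imagined), then fit a decoder that maps rates back to intent. The least-squares fit, the noise model, and all the numbers below are assumptions for illustration; the article does not detail the study's actual fitting procedure.

```python
import numpy as np

# Illustrative calibration sketch with simulated data (not the
# study's method): fit decoding weights from firing rates recorded
# while the intended movements are known.
rng = np.random.default_rng(1)
n_samples, n_neurons, n_dims = 500, 96, 10

# Hidden "tuning": each simulated neuron's rate varies linearly
# with the intended movement, plus noise.
tuning = rng.normal(size=(n_neurons, n_dims))
intended = rng.normal(size=(n_samples, n_dims))   # cued movements
rates = intended @ tuning.T + 0.1 * rng.normal(size=(n_samples, n_neurons))

# Calibration: least-squares fit of a decoder mapping rates back
# to the intended movements.
decoder, *_ = np.linalg.lstsq(rates, intended, rcond=None)

# After calibration, decoded output should closely track intent.
decoded = rates @ decoder
error = np.mean((decoded - intended) ** 2)
print(error < 0.05)
```

Once fitted, the same decoder can be applied to new firing rates alone, which is what lets the participant drive the hand by imagining the movements.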
“Jan used the robot arm to grasp more easily when objects had been displayed during the preceding calibration, which was interesting,” says co-investigator Andrew Schwartz, a professor of neurobiology. “Overall, our results indicate that highly coordinated, natural movement can be restored to people whose arms and hands are paralyzed.”
After surgery in October to remove the electrode arrays, Scheuermann concluded her participation in the study.
The Defense Advanced Research Projects Agency, the Department of Veterans Affairs, and the UPMC Rehabilitation Institute funded the project.
Source: University of Pittsburgh
This article originally appeared on Futurity on 19 December 2014.
Header Photo Credit: DARPA and JHU/APL