This story on neuro-prosthetics is part of an extended series on Regenerative Medicine. For other stories on this topic see williamhaseltine.com and search for Regenerative Medicine. My definition of Regenerative Medicine is any medical modality that returns us to normal health when we are damaged by disease, injured by trauma, disadvantaged by birth, or worn by time. Modalities include: chemicals, genes, proteins and cells used as drugs, gene editing, prosthetics, and mind-machine interfaces. 

Thanks to the work of scientists at Johns Hopkins University, a partially paralyzed man was able to feed himself using just his mind and a pair of robotic arms. The 49-year-old man suffered a spinal cord injury nearly thirty years ago that left him with limited upper body mobility and no use of his fingers. Now, with the help of an advanced brain-machine interface, he can command robotic prosthetic arms to cut food and feed it to him simply by making subtle movements with his wrists and hands in response to audio prompts such as “select cut location” or “moving food to fork”.

Video: Shared Control of Bimanual Robotic Limbs With a Brain-Machine Interface for Self-Feeding

Brain-machine interfaces allow for direct communication between a person’s brain and a computer. When a microelectrode array is implanted into a person’s brain, their brain signals can be recorded and sent to a computer, which decodes those signals and translates them, in real time, into commands for external devices. In the past, brain-machine interfaces have restored some independence to people living with sensory or motor deficits by helping them speak, write, or perform other tasks they would otherwise be unable to do. However, most brain-machine interface systems that involve prosthetic limbs can perform only very simple movements and do not allow the user to customize the prosthetic’s actions in any way.
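To make that record, decode, and command loop concrete, here is a minimal sketch of such a pipeline in Python. Every detail in it (the channel count, the Poisson stand-in for spike counts, the linear decoder) is illustrative rather than drawn from the study.

```python
import numpy as np

# Minimal sketch of the record -> decode -> command pipeline. All details
# here (channel count, Poisson stand-in for spike counts, linear decoder)
# are illustrative, not drawn from the study.

def read_spike_counts(n_channels: int = 96) -> np.ndarray:
    """Stand-in for one time bin of spike counts from a microelectrode array."""
    return np.random.poisson(lam=5.0, size=n_channels)

def decode_velocity(spike_counts: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Linear decoder: map neural activity to a 3-D velocity command."""
    return weights @ spike_counts

def send_to_prosthetic(velocity: np.ndarray) -> None:
    print(f"commanded velocity: {np.round(velocity, 2)}")

weights = np.random.randn(3, 96) * 0.01  # in practice, fit during calibration
for _ in range(5):                        # real systems loop many times per second
    send_to_prosthetic(decode_velocity(read_spike_counts(), weights))
```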

In a recent paper published in the journal Frontiers in Neurorobotics, scientists at Johns Hopkins describe how they developed an advanced brain-machine interface and robotic prosthetic system that not only moves in more flexible ways but also allows the user to customize the robotic prosthetic’s actions to their liking.

How many ways can you move your wrist? Most people will report three distinct types of movement: side to side, up and down, and rotation. Each of these independent movements is called a degree of freedom. One of the greatest challenges in creating robotic prosthetics with human-like dexterity is to maximize the degrees of freedom of each joint in a prosthetic limb while preserving the user’s ability to control each type of movement.
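As a simple illustration of the idea, a joint can be modeled as a named collection of its independent movements; the wrist example above has three. The representation below is purely illustrative.

```python
from dataclasses import dataclass

# Purely illustrative: one way to represent a joint and its degrees of freedom.
@dataclass
class Joint:
    name: str
    movements: tuple

# The three wrist movements described in the text, each one degree of freedom.
wrist = Joint("wrist", (
    "flexion/extension",       # up and down
    "radial/ulnar deviation",  # side to side
    "pronation/supination",    # rotation
))
print(f"{wrist.name}: {len(wrist.movements)} degrees of freedom")  # wrist: 3 degrees of freedom
```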

Previous work has successfully produced fully controlled robotic prosthetics with 7 to 10 degrees of freedom. However, a brain-machine interface that drives two robotic prosthetics at once would require the Johns Hopkins team to develop a way for the user to control as many as 24 degrees of freedom.

To address this challenge, the researchers turned to a strategy called collaborative shared control. In a shared control system, the brain-machine interface user and a semi-autonomous robot work together to accomplish tasks. Control is collaborative when only a subset of a prosthetic’s degrees of freedom is handed to the user, and only at task-specific times. For instance, if a prosthetic were tasked with picking up an object, the user might control the prosthetic’s side-to-side movements to determine the position from which the prosthetic lifts the object. A collaborative shared control system therefore lets the user customize and steer the robotic prosthetic’s movements while keeping the user’s workload to a minimum.
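One way to picture this division of labor is as a per-phase mask over the degrees of freedom: the user’s command wins on the axes assigned to them, and an autonomous controller fills in the rest. The sketch below is an assumption about how such blending could look, not the study’s implementation; the DOF labels and the example phase are invented.

```python
import numpy as np

# Sketch of collaborative shared control: in a given task phase the user
# steers only a subset of the degrees of freedom (DOF), while an autonomous
# controller fills in the rest. The DOF labels and the example phase are
# assumptions for illustration, not details from the study.

DOF = ["x", "y", "z", "roll", "pitch", "yaw"]

def blend(user_cmd: np.ndarray, auto_cmd: np.ndarray, user_dof: set) -> np.ndarray:
    """For each DOF, take the user's command if they control it this phase."""
    mask = np.array([d in user_dof for d in DOF])
    return np.where(mask, user_cmd, auto_cmd)

# Example phase: while positioning to grasp, the user owns the side-to-side
# and forward-back axes; the robot handles height and orientation.
user = np.array([0.2, -0.1, 0.0, 0.0, 0.0, 0.0])
auto = np.array([0.0, 0.0, 0.5, 0.0, 0.1, 0.0])
print(blend(user, auto, user_dof={"x", "y"}))  # [ 0.2 -0.1  0.5  0.   0.1  0. ]
```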

After settling on a control strategy, the next step was to determine how the user could communicate specific actions to the robotic arms. Since the 49-year-old participant could perform subtle movements with his wrists and hands, the team decided to decode the neural signals associated with those hand and wrist gestures to control the prosthetics. For instance, an open palm would cause the robot to move upward, while pinching two fingers together would command the robot to move downward.
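In software terms, this amounts to a lookup from decoded gesture to movement command. The mapping below encodes the two examples given in the text; the gesture names and the fallback behavior are assumptions.

```python
# Illustrative gesture-to-command lookup. The first two entries come from the
# examples in the text (open palm -> up, pinch -> down); the gesture names
# and the fallback behavior are assumptions.

GESTURE_COMMANDS = {
    "open_palm": ("move", "+z"),  # open palm: move the arm upward
    "pinch": ("move", "-z"),      # two-finger pinch: move the arm downward
}

def command_for(gesture: str):
    # Unrecognized or absent gestures leave the arm where it is.
    return GESTURE_COMMANDS.get(gesture, ("hold", None))

print(command_for("open_palm"))  # ('move', '+z')
```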

Since a key feature of a collaborative shared control system is that the user has control only at task-specific times, the Johns Hopkins team programmed the computer to play audio prompts whenever the robotic arms began a task step. These prompts gave the user an opportunity to respond with hand and wrist gestures, allowing him to guide the prosthetic’s positioning during the task.
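A plausible shape for this prompt-then-respond loop is sketched below: announce the step, open a brief window in which decoded gestures are collected, then return control to the autonomous planner. The window length, polling rate, and function names are all assumptions rather than details from the paper.

```python
import time

# Sketch of the prompt-then-respond loop: announce a task step, open a short
# window in which decoded gestures steer the arm, then hand control back to
# the autonomous planner. Window length, polling rate, and function names
# are assumptions, not details from the paper.

def run_step(prompt: str, decode_gesture, window_s: float = 3.0) -> list:
    print(f"audio prompt: {prompt!r}")
    gestures = []
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        g = decode_gesture()   # e.g. "open_palm", "pinch", or None
        if g is not None:
            gestures.append(g)
        time.sleep(0.05)       # poll at roughly 20 Hz (illustrative)
    return gestures            # applied as adjustments to the arm's position

# Example call with a stub decoder that reports no gestures:
# run_step("select cut location", decode_gesture=lambda: None)
```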

After extensive practice with the dual brain-machine interface and robotic prosthetic system, the participant was able to communicate the intended neural signals to the computer through his hand and wrist gestures 85% of the time, and he was ready to take part in a more complicated experiment.

The culmination of this study was a self-feeding experiment. A dessert pastry was placed on a table between two robotic prosthetic arms equipped with a fork and a knife. The participant was then tasked with cutting off a piece of the pastry and bringing it to his mouth using the neural signals decoded from his hand and wrist gestures. A trial was considered fully successful if the participant both cut the pastry and completed the self-feeding portion of the task.

Figure 1: Self-feeding task. Handelman et al., Frontiers in Neurorobotics (2022). DOI: https://doi.org/10.3389/fnbot.2022.918001

After calibrating the robotic prosthetics and running thirty-seven trials in total, the researchers found that in seven trials the participant successfully cut off a reasonably sized piece of food and brought it to his face without dropping it. In another seven, partially successful trials, the participant was able to cut the food, but it fell off the fork. In twenty-six of the thirty-seven trials overall, the participant demonstrated control of both prosthetics.

While these results are preliminary, this study marks a crucial step forward for neurorobotics, smart prosthetics, and regenerative medicine. As brain-machine interfaces and robotic prosthetics continue to advance, so will our ability to restore independence and function to people living with disabilities or sensorimotor deficits.