New research from the University of Chicago and Johns Hopkins University may be on the right track to providing robotic prosthetic limbs with a sense of touch.
Losing the ability to move is a traumatic experience, mentally and physically, but recent advances in medical technology have helped tetraplegics regain a part of their lives they once thought lost forever. Paralysis also robs patients of the sense of touch, which is necessary to perform the most basic tasks – from opening doors, to tying shoes, to picking up objects. Without it, simple tasks become daily struggles.
Sliman Bensmaia, PhD, assistant professor in the Department of Organismal Biology and Anatomy at the University of Chicago, was studying the neural basis of touch in graduate school at Johns Hopkins University when a member of the Johns Hopkins Applied Physics Lab approached him and asked him to join a research project focused on developing an anthropomorphic prosthetic (modular limb) for tetraplegic patients.
In tetraplegic, or quadriplegic, patients, the brain is disconnected from the rest of the body, but the part of the brain that controls the body remains intact. “When you imagine moving your arm, that part of the brain is still active, but nothing happens due to the lost connection,” explains Bensmaia. “The idea behind the project was to stick electrodes in the brain and stimulate it directly to produce some percepts of touch to better control the modular limb.” Without that sensory feedback, the modular limb would not be clinically viable, because few patients would be willing to undergo the dangerous surgery these devices entail.
When Bensmaia was first approached about the project, he thought it wasn’t feasible. The human brain has roughly 86 billion neurons, and trying to interface with it directly seemed to face insurmountable obstacles. “However, when you think about it, we might not be able to restore touch completely, but we can do something useful for a specific patient population,” says Bensmaia. “It’s also a really challenging and exciting endeavor,” he adds.
During the experimental phase, the main challenge was trying to create specific touch perceptions in monkeys and trying to figure out what they were feeling. Behavioral experiments were designed to help the team infer what they felt based on their reactions. “We were able to train the animals to do perceptive discriminations with pokes on their hand,” explains Bensmaia. “We then surreptitiously replaced some of the pokes with electrical stimulation to the brain to see if the animal would behave as if we poked its hand.”
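The logic of those catch-trial experiments can be sketched as a short simulation. Everything below is invented for illustration (the psychometric function, amplitudes, and trial counts are hypothetical, not from the study); it only shows the structure of the protocol: train the animal on mechanical pokes, occasionally substitute electrical stimulation, and compare how often the animal reports a touch in each condition.

```python
import random

def animal_reports_touch(amplitude, threshold=0.5):
    """Toy psychometric function: probability of reporting a touch
    rises with stimulus amplitude (parameters are invented)."""
    p_detect = amplitude / (amplitude + threshold)
    return random.random() < p_detect

def run_session(n_trials=500, catch_fraction=0.2, amplitude=1.0):
    """Mostly mechanical pokes; on catch trials, surreptitiously substitute
    electrical stimulation of the brain and log those reports separately."""
    reports = {"poke": [], "stim": []}
    for _ in range(n_trials):
        kind = "stim" if random.random() < catch_fraction else "poke"
        reports[kind].append(animal_reports_touch(amplitude))
    rate = lambda r: sum(r) / len(r) if r else 0.0
    return rate(reports["poke"]), rate(reports["stim"])
```

If stimulation evokes a percept that feels like a poke, the two report rates should come out comparable; a large gap would suggest the stimulation felt like something else, or like nothing at all.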
Another challenge was figuring out how to stimulate the brain, which is a very complex organ. Electrical stimulation is a very blunt tool, so how can it be used to do something useful with the brain? “By using it as intelligently and as deliberately as we could, we were able to reproduce natural patterns of neural activity that represented the pokes on the hand,” says Bensmaia. He also points to a set of experiments published in the Proceedings of the National Academy of Sciences (PNAS) demonstrating that electrical stimulation of the somatosensory areas creates systematic, meaningful percepts while remaining safe. “There is also research that is about to be published showing that even if the brain is stimulated for long periods of time, it won’t get damaged.”
A third challenge involved getting the different apparatuses to work in concert. Measuring neural activity required hundreds of electrodes implanted in the brain, and the signals from those electrodes had to be read and interpreted by a data acquisition system. “The hand was then stimulated with a mechanical stimulator to probe how the system responded,” says Bensmaia. “Once the hand was stimulated, we were able to stimulate the brain in precise ways.”
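The pieces described above can be sketched as a minimal loop. All the names and values here are hypothetical stand-ins, not the lab's actual software: a data-acquisition read from the implanted array, a command to the mechanical stimulator, and a comparison of activity before and after the poke.

```python
def acquire_snapshot(n_channels=100):
    """Stand-in for one read from the data acquisition system attached to
    the implanted electrode array: one activity value per channel.
    Placeholder zeros; a real system returns recorded signals."""
    return [0.0] * n_channels

def poke_hand(site, force_n):
    """Stand-in for commanding the mechanical stimulator to poke the hand."""
    return {"site": site, "force_newtons": force_n}

def characterize_response(site, force_n):
    """Record baseline activity, poke the hand, record again, and return
    the evoked change per channel -- the touch-to-neural-response mapping
    that later guides precise stimulation of the brain."""
    baseline = acquire_snapshot()
    poke_hand(site, force_n)
    evoked = acquire_snapshot()
    return [e - b for e, b in zip(evoked, baseline)]
```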
Bensmaia’s team also monitored the animals’ behavior using eye-tracking, and rewarded the animals when they did something right. “All of these engineering pieces need to work together for it to be successful,” he says.
There are other research groups working with sensitized prosthetics, such as the University of Southern California, which is working on sensitized fingertips. However, most previous efforts were proofs of principle that if the brain is stimulated, percepts can be evoked and guide behavior. Bensmaia’s research is an actual blueprint to implement this sort of technology into humans.
The blueprint shows how to connect the sensors on the hand to the electrodes implanted in the brain, and how to convert the time-varying forces measured by the sensors into patterns of electrical stimulation of the brain. “The blueprint is ready to go, and that’s the biggest thing for me,” says Bensmaia. “We have an algorithm that can be implanted in a human.”
Other research groups have tried to connect the sensors to the electrodes through quasi-arbitrary mappings that patients must learn. The challenge, according to Bensmaia, is that even though the brain can learn to associate things, the hand is a very complex organ. “It has 20-some degrees of freedom, it can move in a lot of different ways, and it can touch a lot of things — different shapes, sizes, textures, etc.,” says Bensmaia. “So the sensory input that you get from the hand is very complex, and I’m not convinced that an adult patient can learn truly arbitrary patterns.”
Naturalistic sensory percepts need to be created for patients to be able to use neuroprosthetics, and Bensmaia’s team has developed a technique to elicit such percepts through electrical stimulation of the brain. A series of experiments determined what information matters: when an object is grasped, is it touching the hand? Which parts of the hand? What are the opposing forces on it? “You want to know how much force is needed to pick up the object, and you don’t want to destroy the object once you pick it up,” explains Bensmaia. “This was the kind of information that we wanted to convey.”
The algorithms Bensmaia’s team created showed how to connect the sensors on the prosthetic hand to the electrodes implanted in the brain, and how to convert the time-varying output of the sensors into electrical stimulation applied to the brain.
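A highly simplified version of such an encoding step might look like the sketch below. The somatotopic map, gain, and current limits are all invented for illustration; the actual algorithm is derived from the measured relationship between touch and neural activity, not from a hand-picked linear rule.

```python
# Hypothetical somatotopic map: which implanted electrode represents
# which sensorized region of the prosthetic hand.
SENSOR_TO_ELECTRODE = {"thumb_tip": 3, "index_tip": 7, "palm": 12}

def force_to_current(force_n, gain_ua_per_n=40.0, base_ua=10.0, ceiling_ua=100.0):
    """Convert a sensed contact force (newtons) into a stimulation current
    (microamps), capped at a safety ceiling. All parameters are invented."""
    if force_n <= 0:
        return 0.0
    return min(ceiling_ua, base_ua + gain_ua_per_n * force_n)

def encode_frame(sensor_forces):
    """Map one time step of prosthetic sensor readings to per-electrode
    stimulation commands on the matching patch of somatosensory cortex."""
    return {SENSOR_TO_ELECTRODE[s]: force_to_current(f)
            for s, f in sensor_forces.items() if s in SENSOR_TO_ELECTRODE}
```

Running this frame by frame over the sensor stream would, for example, drive only the index-finger electrode when the index fingertip presses an object while the palm is unloaded.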
Human Testing & Robust Arrays
This type of technology is ready for human testing, but one frustrating component is the electrode arrays required to carry the signals. “These arrays are only good for a few years before you need to change them, and replacing them is not an option in a human patient,” says Bensmaia. The ultimate bottleneck for the project is producing an array robust enough to last the rest of a patient’s life. Much of the other technology involved also needs to be miniaturized for these prosthetics to be truly viable in the clinic.
The arrays also need to be placed in the cortex of the brain, which carries a risk to the patient’s life. Moreover, the body reacts to the implant, forming scar tissue around the area that eventually encapsulates the array and forces it to shut down.
“The first patient population we’re targeting is tetraplegics because the sensorimotor parts of their brain are still there, but they are not being used for anything,” says Bensmaia. “The consequences of their injury are really devastating to their quality of life, so they are more motivated and willing to endure the risks involved with the surgery.”
If the surgeries are successful, patients regain the majority of their motor function, and the technology proves reliable and robust, then it could transition to amputees. The hope is that, in the next 20 years, more patients will be seen with these types of prosthetics and will be able to control them. WDD
This article originally appeared in the January/February print issue.