By Gordy Slack
In the 1986 sci-fi classic Aliens, Sigourney Weaver’s character defeats her nemesis with a huge exoskeleton device that she operates from inside with buttons and a joystick. Twenty-two years later, Tony Stark steps inside Iron Man, a much sleeker and more agile version of a wearable robot. How was Iron Man controlled? Possibly with sensors that read electrical signals from the brain, either through an EEG-like device embedded in the helmet or through electrodes implanted in the motor cortex. Or maybe it was a simpler touch interface that read the contact forces applied by the body itself and amplified or converted its motions into movement of the robotic exoskeleton.
All of those approaches are feasible and, in fact, being investigated or implemented by engineers around the world in efforts to explore alternative ways to operate the coming generation of wearable machines.
Jacob Rosen and his colleagues in the UCSC Bionics Lab are taking a different approach. They have developed a robotic arm controlled by the electrical signals the brain sends through the nerves to contract the muscles – a signal known as the electromyogram (EMG). These signals can be sensed by electrodes attached to the skin at key locations over the muscles.
There are several advantages to operating robotic devices from EMG signals, Rosen says. It is less invasive and less expensive than other “upstream” sources (those closer to the brain), such as electrodes implanted directly in the motor cortex, an approach that also requires a long period of training to generate usable control signals. Nor does interpreting EMG cause the death of surrounding brain tissue, as implants do – a process that eventually renders the implants useless anyway.
EMG also has significant advantages over further “downstream” methods, such as simple touch interfaces. First, the approach can teach us a lot about muscle physiology and improve our ability to simulate and predict specific movements. Second, EMG has timing on its side.
Our neuromuscular system operates with an inherent lag known as the electromechanical delay: somewhere between 50 and 300 milliseconds pass between the moment a command to move, initiated by the motor cortex and carried by a nerve, reaches the muscle in, say, the arm, and the moment the muscle mechanically contracts.
“That is plenty of time,” Rosen says, “to take the neural signal along with the joint angle and velocity, and predict what the muscle is about to do, and then do the same thing, only amplified, with the robotic exoskeleton.”
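The control loop Rosen describes can be sketched in a few lines of code. This is a minimal illustration, not the lab's actual controller: it assumes a first-order activation model, a toy Hill-style torque estimate, and made-up constants (`tau`, `max_torque`, `gain`) chosen only to show the structure of EMG-driven prediction.

```python
import math

def muscle_activation(emg_rectified, prev_activation, dt=0.005, tau=0.05):
    """First-order activation dynamics: rectified EMG excitation -> muscle
    activation. tau is an assumed activation time constant (seconds)."""
    return prev_activation + dt * (emg_rectified - prev_activation) / tau

def predicted_torque(activation, joint_angle, joint_velocity, max_torque=40.0):
    """Toy Hill-style muscle model: torque scales with activation and falls
    off away from an assumed optimal joint angle and with shortening speed."""
    length_factor = math.exp(-((joint_angle - 1.57) / 0.9) ** 2)  # peaks near 90 deg
    velocity_factor = max(0.0, 1.0 - 0.25 * joint_velocity)       # weaker when shortening fast
    return max_torque * activation * length_factor * velocity_factor

def exoskeleton_command(emg_rectified, prev_activation,
                        joint_angle, joint_velocity, gain=3.0):
    """Predict what the muscle is about to do, then do the same, amplified:
    return the exoskeleton torque command and the updated activation state."""
    a = muscle_activation(emg_rectified, prev_activation)
    tau_muscle = predicted_torque(a, joint_angle, joint_velocity)
    return gain * tau_muscle, a
```

Because the EMG arrives 50 to 300 milliseconds before the muscle actually contracts, a loop like this has time to run every few milliseconds and have the amplified torque ready before the arm moves.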
“It is like walking with a dog,” says Rosen. “If the dog is young and does not know its way, it is going to pull the leash. You would have to apply a force on the dog to direct it. [Like the old touch interface.] But if the dog knows its way, then the leash can be loose. We are trying to allow a loose-leash situation by developing software that employs algorithms that emulate muscle physiology – known as a myoprocessor – to predict what a muscle is going to do before it has begun to do it.”
“Developing the myoprocessor model is difficult,” Rosen says, “partly because the body is so redundant.” Three different muscles, for example, are responsible for flexing the elbow. That redundancy adds another level of complexity the algorithms must account for, since the EMG signal is collected from only one of the three muscles.
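To make the redundancy problem concrete: the elbow's flexors are the biceps brachii, brachialis, and brachioradialis, yet EMG may be recorded from only one of them. One simple coping strategy, sketched here purely for illustration (the sharing fractions are assumed, not measured, and this is not the lab's method), is to posit fixed load-sharing ratios and scale the one measured muscle's predicted torque up to an estimate for the whole joint.

```python
# Assumed load-sharing fractions across the three elbow flexors.
# Illustrative values only; real ratios vary with posture and load.
SHARE = {"biceps": 0.40, "brachialis": 0.45, "brachioradialis": 0.15}

def total_flexion_torque(measured_muscle, measured_torque):
    """Estimate total elbow-flexion torque from one muscle's predicted
    torque, assuming the fixed sharing fractions above hold throughout
    the movement."""
    return measured_torque / SHARE[measured_muscle]
```

A real myoprocessor must do far better than fixed ratios, since the way the three muscles share the load changes with joint angle, velocity, and fatigue, which is precisely the complexity Rosen's algorithms must capture.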
While Rosen is modeling the muscles to make his wearable robots more responsive to human intentions, he is also using the robots to help study how human motion works, how it sometimes does not, and how best to repair it when it is not working properly.
“I focus on medical applications, specifically on rehabilitation,” says Rosen, who is working with a physical therapist at UCSF and a neurologist at the San Francisco VA Medical Center on projects that will, he hopes, help stroke victims and victims of other neural damage recover use of their limbs.
Recovering from neurological disorder often requires the rewiring of the brain, making new pathways through undamaged brain territory, says Rosen. And that requires repetition of movement. “Lots of repetition,” he says. “So many repetitions, in fact, that therapists do not always have the time themselves to deliver such treatments.”
The device Rosen has developed, called the EXO-UL7, can stand in for the therapist, allowing what little muscle control persists in a damaged arm to move the whole arm plus a load with the aid of the robot. “The EXO-UL7 compensates for gravity and lets the patient concentrate on control alone,” says Rosen. And the patient can do that for as long as is optimal for recovery, without depending on a physical therapist’s schedule.
Rosen’s group also plans to combine the robotic suit with virtual reality games to ease the tedium of doing long hours of physical therapy. Their research is currently funded in part by the Telemedicine and Advanced Technology Research Center, a U.S. Army health and medical tech funding organization.
The current prototype is anchored to the wall, but future freestanding versions could be used as robotic prosthetics, allowing users with permanent nerve damage to enjoy a free range of movement and to carry out ordinary tasks. Or it could be used to amplify normal human strength. In fact, at least two exoskeleton models that do just that, developed mostly with DARPA funding for military use, are now nearing implementation: one made in the Berkeley Robotics and Human Engineering Lab of Homayoon Kazerooni, the other by Sarcos, a private company in Salt Lake City, Utah. Neither one uses EMG.
Rosen’s wearable robot could also be employed for the quick and responsive operation of tools at a distance, a function key to various kinds of telemedicine, such as remote surgery, or for conducting experiments in hard-to-reach or dangerous environments, such as battlefields or other planets.