Most available prosthetics use electromyogram (EMG) sensors to translate muscle movements at the site where the limb was removed into hand and finger motions. But those sensors are fairly limited in what they can do. “EMG sensors aren’t very accurate,” Gil Weinberg, the professor leading the project, said in a statement. “They can detect a muscle movement, but the signal is too noisy to infer which finger the person wants to move.”

So the team took its prosthetic one step further and attached an ultrasound probe. Just as physicians use ultrasound machines to view a fetus inside the womb, the probe can see which muscles are moving in an amputee’s arm, and algorithms then translate those movements into individual finger motions. “By using this new technology, the arm can detect which fingers an amputee wants to move, even if they don’t have fingers,” said Weinberg.
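The team hasn’t published the code behind this, but the core idea — mapping ultrasound-derived muscle-activity patterns to intended finger movements — can be sketched in a few lines. Below is a minimal, hypothetical Python example; the finger labels, the `train_centroids` and `classify` helpers, and the nearest-centroid approach are all illustrative stand-ins, not Georgia Tech’s actual algorithm, and it assumes each ultrasound frame has already been reduced to a numeric feature vector.

```python
# Hypothetical sketch (not the actual Georgia Tech system): infer which
# finger a user intends to move from ultrasound-derived muscle features.
# Assumes each ultrasound frame is already reduced to a feature vector
# and that labeled calibration frames exist for each finger.

import numpy as np

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def train_centroids(frames: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Average each finger's calibration frames into one template vector."""
    return np.stack(
        [frames[labels == i].mean(axis=0) for i in range(len(FINGERS))]
    )

def classify(frame: np.ndarray, centroids: np.ndarray) -> str:
    """Return the finger whose template is nearest to the incoming frame."""
    distances = np.linalg.norm(centroids - frame, axis=1)
    return FINGERS[int(np.argmin(distances))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake calibration data: 20 noisy frames per finger, 64 features each,
    # with each finger centered on a distinct activation pattern.
    templates = rng.normal(size=(len(FINGERS), 64))
    frames = np.concatenate(
        [t + 0.1 * rng.normal(size=(20, 64)) for t in templates]
    )
    labels = np.repeat(np.arange(len(FINGERS)), 20)

    centroids = train_centroids(frames, labels)
    test_frame = templates[1] + 0.1 * rng.normal(size=64)  # a noisy "index" frame
    print(classify(test_frame, centroids))  # prints: index
```

A production system would extract features from live ultrasound video and almost certainly use a more sophisticated model, but the overall shape of the pipeline — calibrate a pattern per finger, then match each incoming frame against those patterns — would be similar.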
A number of groups are working to make prosthetics more capable and lifelike. Some of those efforts include adding tactile feedback, so users know where their prosthetic is without having to look, and giving prosthetics the ability to see what they need to grasp. DARPA even has an advanced prosthetic named LUKE, also inspired by Skywalker.
This isn’t the first prosthetic the Georgia Tech team has built for Barnes. In 2014, they gave him an arm that let him play drums. It even had a second drumstick that moved in response to the music being played and could play faster than any human drummer. Of his second, more dexterous prosthetic, Barnes said, “It’s completely mind-blowing. This new arm allows me to do whatever grip I want, on the fly, without changing modes or pressing a button. I never thought we’d be able to do this.”