Researchers at the University of California, Berkeley have developed a wearable sensor that can measure electrical signals in the forearm and use AI to correlate them with hand gestures, such as the movements of individual fingers. The team demonstrated that the system can control a robotic prosthetic hand and that it may provide a way for amputees to perform delicate movements with such devices.
The flexible sensor measures electrical signals at 64 discrete points on the forearm, and an on-board chip then uses AI to interpret these signals as specific hand gestures. A user can train the system to recognize unique hand gestures; so far, the team has successfully trained it to accurately recognize 21 different gestures, including a flat hand, a thumbs up, and a fist.
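To make the signal-to-gesture mapping concrete, here is a minimal sketch of how 64-channel forearm EMG windows could be turned into gesture labels. The channel count and example gestures come from the article; the feature choice (mean absolute value per channel) and the nearest-centroid classifier are illustrative assumptions, not the team's actual method.

```python
import numpy as np

N_CHANNELS = 64
GESTURES = ["flat_hand", "thumbs_up", "fist"]  # 3 of the 21 trained gestures

def features(emg_window):
    """Mean absolute value per channel -- a common sEMG feature (assumption)."""
    return np.mean(np.abs(emg_window), axis=0)  # shape: (N_CHANNELS,)

def train(examples):
    """examples: dict mapping gesture name -> list of (samples, 64) EMG windows.
    Returns one centroid feature vector per gesture."""
    return {g: np.mean([features(w) for w in ws], axis=0)
            for g, ws in examples.items()}

def classify(centroids, emg_window):
    """Label a new window with the gesture whose centroid is nearest."""
    f = features(emg_window)
    return min(centroids, key=lambda g: np.linalg.norm(f - centroids[g]))
```

In this sketch, "training the system on a unique gesture" amounts to recording a few labeled windows and averaging their features; recognition is then a distance comparison against the stored centroids.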
“When you want your hand muscles to contract, your brain sends electrical signals through neurons in your neck and shoulders to muscle fibers in your arms and hands,” said Ali Moin, a researcher involved in the study. “Essentially, what the electrodes in the cuff are sensing is this electrical field. It’s not that precise, in the sense that we can’t pinpoint which exact fibers were triggered, but with the high density of electrodes, it can still learn to recognize certain patterns.”
The system uses AI to interpret the signals. This occurs on-board and does not rely on cloud computing, which makes the data interpretation faster and helps to keep patient data secure and private. “In our approach, we implemented a process where the learning is done on the device itself,” said Jan Rabaey, another researcher involved in the project. “And it is extremely quick: You only have to do it one time, and it starts doing the job. But if you do it more times, it can get better. So, it is continuously learning, which is how humans do it.”
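The on-device learning behavior Rabaey describes, usable after a single example but improving with each additional one, can be sketched with a running-mean update. The incremental centroid update below is an illustrative assumption chosen because it needs no stored training set (a practical constraint for on-device learning), not the team's actual algorithm.

```python
import numpy as np

class OnDeviceGestureModel:
    """Illustrative on-device learner: one-shot on the first example per
    gesture, then refined incrementally with each further example."""

    def __init__(self, n_channels=64):
        self.n_channels = n_channels
        self.centroids = {}  # gesture -> running-mean feature vector
        self.counts = {}     # gesture -> number of examples seen

    def learn(self, gesture, feature_vec):
        """Update the model with one labeled feature vector."""
        f = np.asarray(feature_vec, dtype=float)
        if gesture not in self.centroids:
            # First example: usable immediately ("you only have to do it one time")
            self.centroids[gesture] = f.copy()
            self.counts[gesture] = 1
        else:
            # Further examples: running-mean update, no stored history needed
            self.counts[gesture] += 1
            n = self.counts[gesture]
            self.centroids[gesture] += (f - self.centroids[gesture]) / n

    def predict(self, feature_vec):
        """Return the gesture whose centroid is nearest."""
        f = np.asarray(feature_vec, dtype=float)
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(f - self.centroids[g]))
```

Because each update touches only a stored mean and a counter, the model fits on a small embedded chip and never ships raw signal data off the device, matching the privacy rationale in the article.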