Researchers at MIT have developed a wearable sensor that detects small deformations of the skin, potentially helping amyotrophic lateral sclerosis (ALS) patients communicate through facial movements. The sensors are far cheaper than current assistive communication technologies for ALS patients, and may be more effective.
Canan Dagdeviren, the lead researcher on the project, who has previously been interviewed by Medgadget, was inspired to develop communication technology for people with ALS after meeting Professor Stephen Hawking in 2016 and noting that the technology he used to communicate was not optimal.
Hawking, who passed away in 2018, used an infrared sensor to detect twitches in his cheek, which he used to select letters on a computer screen. The method worked, but was slow. Other techniques involve measuring the activity of the facial nerves in ALS patients, but these can suffer from poor accuracy.
“These devices are very hard, planar, and boxy, and reliability is a big issue,” said Dagdeviren. “You may not get consistent results, even from the same patients within the same day.” To address these issues, Dagdeviren and colleagues have designed a simpler system that relies on a wearable skin-like sensor that ALS patients can wear on their faces.
The wearable includes a silicone film that contains four piezoelectric sensors. The sensors detect deformation of the underlying skin and convert it into an electrical signal. This could mean that subtle facial movements, such as a smile or a twitch, could be registered as specific messages, such as “I’m hungry”. These facial movements, alone or in combination, could correspond to a library of phrases, aiding rapid communication.
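The mapping described above, from facial movements (alone or in combination) to a library of preset phrases, can be sketched as a simple lookup. This is a hypothetical illustration, not the team's actual software; the gesture names and phrases are assumptions chosen for the example.

```python
# Hypothetical sketch: map detected facial-movement sequences to preset
# phrases, as the article describes. Gesture names and phrases here are
# illustrative assumptions, not details of the MIT device.

PHRASE_LIBRARY = {
    ("smile",): "I'm hungry",
    ("twitch",): "Yes",
    ("twitch", "twitch"): "No",
    ("smile", "twitch"): "Please call the nurse",
}

def decode(gestures):
    """Return the phrase for a detected gesture sequence, if any."""
    return PHRASE_LIBRARY.get(tuple(gestures), "<unrecognized>")

print(decode(["smile"]))            # I'm hungry
print(decode(["smile", "twitch"]))  # Please call the nurse
```

Because each entry is keyed on a sequence rather than a single gesture, a handful of distinguishable movements can combine into a much larger vocabulary of messages.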
“We can create customizable messages based on the movements that you can do,” said Dagdeviren. “You can technically create thousands of messages that right now no other technology is available to do.”