Bioengineers at the UCLA Samueli School of Engineering have developed a glove-like device that can translate American Sign Language into English speech in real time through a smartphone app. The system includes a pair of gloves with thin, stretchable sensors that run the length of each of the five fingers.
The sensors on the glove are made from electrically conducting yarns and pick up the hand motions and finger placements that stand for individual letters, numbers, words, and phrases. The device turns those finger movements into electrical signals, which are sent to a dollar-coin-sized circuit board worn on the wrist. The board transmits the signals wirelessly to a smartphone, which translates them into spoken words at a rate of about one word per second.
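As a rough illustration of that signal path (not the UCLA team's actual software), the sketch below simulates sampling the five finger sensors, matching each reading against a small reference dictionary, and emitting roughly one word per second. The function names, sensor values, and two-word vocabulary are all hypothetical stand-ins.

```python
# Hypothetical sketch of the glove-to-speech pipeline described above.
import random
import time

def read_finger_sensors() -> list[float]:
    # Stand-in for sampling the conductive-yarn sensors: one value per finger,
    # in [0, 1], where a higher value means the finger is more bent.
    return [round(random.random(), 2) for _ in range(5)]

# Tiny made-up dictionary of bend patterns; the real system learns 660 signs.
REFERENCE_SIGNS = {
    "hello": [0.1, 0.1, 0.1, 0.1, 0.1],   # open hand
    "yes":   [0.9, 0.9, 0.9, 0.9, 0.9],   # closed fist
}

def classify(reading: list[float]) -> str:
    # Nearest-neighbor match between the live reading and the reference patterns.
    def distance(a: list[float], b: list[float]) -> float:
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(REFERENCE_SIGNS, key=lambda word: distance(reading, REFERENCE_SIGNS[word]))

def run(seconds: int = 5) -> None:
    # The article reports about one translated word per second,
    # so this loop produces one word per second as well.
    for _ in range(seconds):
        reading = read_finger_sensors()   # glove -> electrical signals
        word = classify(reading)          # smartphone app -> spoken word
        print(f"sensors={reading} -> '{word}'")
        time.sleep(1)

if __name__ == "__main__":
    run()
```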
The device also includes adhesive sensors placed on the testers' faces, between the eyebrows and on one side of the mouth, to capture the facial expressions that are part of American Sign Language. The gloves are made from lightweight, inexpensive, yet long-lasting stretchable polymers, and the electronic sensors themselves are flexible and cheap.
The wearable device was tested on four deaf people who use American Sign Language. The wearers repeated each hand gesture 15 times, and a custom machine-learning algorithm turned the gestures into the letters, numbers, and words they represented. The system recognized 660 signs, including every letter of the alphabet and the numbers 0 through 9. The device is neither bulky nor uncomfortable, and it lets people who use sign language communicate directly with non-signers without relying on an interpreter. A commercial model based on this technology would require a larger vocabulary and a faster translation speed.
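To make the training step concrete, here is a hypothetical sketch of how 15 repetitions of each sign, recorded as five-finger sensor readings, could be used to fit an off-the-shelf classifier. It is not the team's custom algorithm; the sign list, feature count, simulated data, and the k-nearest-neighbors model are illustrative assumptions.

```python
# Hypothetical sketch: 15 repetitions per sign used to fit a simple classifier.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

RNG = np.random.default_rng(0)
SIGNS = ["A", "B", "7", "hello"]   # stand-ins for the 660 learned signs
REPS_PER_SIGN = 15                 # repetitions per gesture, as reported
FEATURES = 5                       # one value per finger sensor

def simulated_repetitions() -> np.ndarray:
    # Each sign gets its own mean bend pattern plus per-repetition noise,
    # mimicking slight variation across the 15 recorded repetitions.
    center = RNG.uniform(0, 1, FEATURES)
    return center + RNG.normal(0, 0.05, size=(REPS_PER_SIGN, FEATURES))

X = np.vstack([simulated_repetitions() for _ in SIGNS])
y = np.repeat(SIGNS, REPS_PER_SIGN)

model = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# A new, slightly noisy reading of the first sign should map back to its label.
test_reading = X[0] + RNG.normal(0, 0.05, FEATURES)
print(model.predict(test_reading.reshape(1, -1))[0])
```

A larger commercial vocabulary would mostly mean more labeled recordings and a model that scales better than this toy nearest-neighbor stand-in.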