The idea was developed by researchers at the University of California, Los Angeles. The device tracks a user's hand movements, converting them into speech.
American Sign Language combines a variety of hand movements with facial expressions, so in addition to the electronic glove, the system includes a pair of sensors, one placed between the eyebrows and the other at the side of the mouth. The device is battery powered.
The sensors capture movements and wirelessly transmit the collected data to a mobile app, which converts it into clear speech, enabling a non-speaking person to express their thoughts even to people who don't know sign language. Machine learning has allowed the programme to master about 650 signs, including letters and numbers.
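To give a rough sense of how sensor readings might be mapped to signs, here is a minimal, purely illustrative sketch: a toy nearest-neighbour classifier that matches a vector of finger-flex readings against stored templates. All names and values are hypothetical; the actual UCLA system uses its own machine-learning model trained on far richer data.

```python
import math

# Hypothetical calibration data: each sign is a vector of finger-flex
# readings, one per finger (0.0 = straight, 1.0 = fully bent).
TEMPLATES = {
    "A": [0.9, 0.9, 0.9, 0.9, 0.1],   # closed fist, thumb out
    "B": [0.1, 0.1, 0.1, 0.1, 0.9],   # flat hand, thumb tucked
    "C": [0.5, 0.5, 0.5, 0.5, 0.5],   # curved hand
}

def classify(reading):
    """Return the sign whose template is closest (Euclidean distance) to the reading."""
    return min(TEMPLATES, key=lambda sign: math.dist(reading, TEMPLATES[sign]))

print(classify([0.85, 0.95, 0.9, 0.88, 0.15]))  # prints "A"
```

In the real device, a trained model replaces the hand-written templates, which is how the system scales to hundreds of signs rather than a handful.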
The developers acknowledge that the glove has some way to go before reaching its full potential, and they promise that the commercial version will be considerably faster and smarter, with a larger vocabulary.