The microphones installed in smart speakers turned out to be sensitive enough for commands to be transmitted to them with a laser beam.
Engineers from the University of Michigan and the University of Electro-Communications in Tokyo have developed a laser attack on smart speakers. The method, called Light Commands, exploits a weakness of the MEMS microphones installed in many smart speakers.
The problem lies in the high sensitivity of such microphones, which respond not only to sound vibrations but also to light, such as a laser beam whose intensity is modulated with an audio signal. This allowed the researchers to develop a method for sending inaudible commands to the speakers using a laser.
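The core trick is amplitude modulation: the laser's brightness is varied in step with a recorded voice command, and the MEMS diaphragm reacts to the fluctuating light as if it were sound. A minimal sketch of that modulation step, assuming a laser driver that accepts a normalized intensity in the range 0 to 1 (the function and parameter names here are illustrative, not from the original research code):

```python
import math

SAMPLE_RATE = 44_100  # audio sample rate in Hz

def laser_intensity(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate an audio signal (samples in [-1, 1]) onto a
    laser's intensity. The beam brightness tracks the waveform, so the
    microphone's diaphragm responds as though it heard the command."""
    out = []
    for sample in audio:
        level = bias + depth * sample      # vary brightness around a DC bias
        out.append(min(1.0, max(0.0, level)))  # clamp to the diode's range
    return out

# A 1 kHz test tone standing in for a recorded voice command.
tone = [math.sin(2 * math.pi * 1000 * n / SAMPLE_RATE) for n in range(100)]
intensity = laser_intensity(tone)
```

In a real attack the modulated signal would drive a laser diode aimed at the speaker's microphone port; the sketch only shows how an audio waveform maps to beam intensity.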
Smart speakers running Siri, Alexa, and Google Assistant can be attacked from distances of up to 110 meters, and range is effectively the only limitation for potential attackers. Most such devices are not protected by passwords, so wherever a direct line of sight exists, commands can be sent to open the locks of a smart home or unlock an electric car in the garage.
However, the attack may fail if the owner of the smart speaker is nearby, since voice assistants usually announce aloud that they have understood a command and are carrying it out before executing it.