A joint team of researchers from the University of Michigan and the University of Electro-Communications in Japan has developed an attack technique called Light Commands, which sends commands to smart devices by aiming a laser at their MEMS microphones.
MEMS stands for microelectromechanical systems. The Light Commands attack exploits the fact that the MEMS microphones used in many modern devices are so sensitive that they respond to light sources, such as a laser, much as they respond to sound waves. By modulating the intensity of the laser beam, the researchers were able to inject signals into the microphone and issue inaudible commands to Siri, Google Assistant, Facebook Portal, and Alexa.
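The modulation itself is conceptually simple amplitude modulation: the laser's drive current, and therefore its light intensity, tracks the audio waveform of the spoken command, and the microphone demodulates the intensity back into an electrical signal. The sketch below illustrates the idea; the bias and modulation-depth values are hypothetical placeholders, not figures from the research.

```python
import numpy as np

SAMPLE_RATE = 44_100          # audio sample rate, Hz
BIAS_CURRENT_MA = 200.0       # hypothetical DC bias that keeps the laser diode lasing
MODULATION_DEPTH_MA = 150.0   # hypothetical peak current swing around the bias

def audio_to_drive_current(audio: np.ndarray) -> np.ndarray:
    """Amplitude-modulate an audio waveform onto a laser drive current.

    With s(t) normalized to [-1, 1], the drive current is
    I(t) = I_bias + depth * s(t), so the light intensity simply
    tracks the audio signal the attacker wants the microphone to "hear".
    """
    s = audio / np.max(np.abs(audio))            # normalize to [-1, 1]
    return BIAS_CURRENT_MA + MODULATION_DEPTH_MA * s

# Example: a 1 kHz test tone standing in for a recorded voice command.
t = np.linspace(0, 0.01, int(SAMPLE_RATE * 0.01), endpoint=False)
tone = np.sin(2 * np.pi * 1000 * t)
current = audio_to_drive_current(tone)
# The drive current never drops to zero, so the laser stays on throughout.
assert current.min() > 0
```

In a real setup this waveform would be fed to a laser driver's modulation input; here it only serves to show how the audio maps onto light intensity.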
The researchers write that an attacker can thus deliver arbitrary audio signals to a vulnerable device by pointing the light directly at the microphone port. Voice-controlled systems generally do not require user authentication, so an attack can be carried out without a password or PIN (though the researchers note that brute-forcing a PIN is also possible). Keep in mind, however, that such devices usually respond to a command with audible and visual feedback, so people nearby may notice the attack.
“An attacker can use light-injected voice commands to unlock the target's smart-lock-protected front door, open garage doors, make purchases on e-commerce websites at the target's expense, or even locate, unlock, and start various vehicles (for example, Tesla and Ford) if they are connected to the target's Google account,” the authors of Light Commands say.
In fact, the only serious limitations on Light Commands attacks are the need for a line of sight to the target device and for very precise aiming of the light at the microphone. In their experiments, the researchers carried out Light Commands attacks from distances of up to 110 meters, including against devices located in another building across from the room with the target, or at the far end of a very long corridor.
Moreover, the attacks did not require particularly expensive equipment. One of the setups described in the paper consisted of a laser pointer ($18 for a pack of three), a Wavelength Electronics LD5CHA laser driver ($339), and an audio amplifier ($27.99). An optional telephoto lens ($199.95) can be added to focus the laser more precisely for long-range attacks. The researchers also tested an infrared laser, which is invisible to the human eye and could enable stealthier attacks, as well as a powerful flashlight.
The researchers describe several software and hardware defenses that manufacturers could deploy against such attacks. Since the root problem is the lack of user authentication, devices could, for example, ask their owners a simple question before executing a command, though this would clearly annoy users. The researchers also advise manufacturers to shield microphones more thoroughly, covering them with grids and baffles that block light but not actual sound waves. In addition, smart speakers and similar devices usually have several microphones, so they could be “taught” to distrust commands that register on only one microphone out of many.
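That last, multi-microphone defense lends itself to a simple software check: a real voice excites every microphone in the array with comparable energy, while a laser spot focused on one port leaves the others nearly silent. The sketch below is a minimal illustration of this idea, not the researchers' proposed implementation; the energy-ratio threshold is an assumed value.

```python
import numpy as np

# Hypothetical threshold: a genuine sound wave should register on every
# microphone with comparable energy; a focused laser hits only one port.
ENERGY_RATIO_THRESHOLD = 0.1

def looks_like_light_injection(channels: np.ndarray) -> bool:
    """Return True if the signal is concentrated in a single channel.

    `channels` is a (num_mics, num_samples) array of simultaneous
    recordings. The check flags a command as suspicious when every
    microphone other than the loudest one captured under 10% of the
    loudest channel's energy.
    """
    energy = np.sum(channels.astype(float) ** 2, axis=1)
    loudest = energy.max()
    others = np.delete(energy, energy.argmax())
    return bool(np.all(others < ENERGY_RATIO_THRESHOLD * loudest))

# A real voice shows up on all four microphones...
voice = np.random.default_rng(0).normal(size=(4, 1000))
# ...while a laser injection shows up on only one.
laser = np.zeros((4, 1000))
laser[2] = np.random.default_rng(1).normal(size=1000)
```

A production implementation would of course need to handle reverberation, off-axis talkers, and partially occluded microphones, but the core cross-channel comparison is this simple.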
A practical demonstration of Light Commands attacks, including opening a garage door and issuing a command through a window from another building, can be seen in the videos below.