A joint team of researchers, including experts from the University of Michigan, Washington University in St. Louis, the Chinese Academy of Sciences, and the University of Nebraska-Lincoln, has demonstrated an attack they call SurfingAttack. It allows remote control of the Google Assistant and Apple Siri voice assistants using ultrasonic commands inaudible to the human ear.
The researchers explain that, in essence, voice assistants listen across a much wider frequency range than the human voice occupies. Because of this, they respond to ultrasonic vibrations and interpret them as voice commands. As a result, an attacker can silently interact with devices running voice assistants, intercept two-factor authentication codes, and place calls.
To transmit the malicious ultrasonic signals, the new attack exploits the non-linear behavior of MEMS microphone circuits. For this, the researchers used a simple $5 piezoelectric transducer attached to the surface of a table. To hide the attack from the victim, they lowered the volume of the target device's responses with an ultrasonic command, while still recording the assistant's voice replies through a hidden "bug" placed under the table.
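The demodulation principle behind this class of attacks can be illustrated in a few lines of NumPy. The sketch below is an assumption-laden toy model, not the researchers' actual tooling: a "voice" command is amplitude-modulated onto a 25 kHz carrier, and a quadratic term (standing in for the MEMS microphone's non-linearity) shifts the envelope back into the audible band, where a crude low-pass filter recovers it.

```python
import numpy as np

# Toy model of non-linear ultrasonic demodulation (illustrative values only).
fs = 192_000                      # sample rate high enough for a 25 kHz carrier
t = np.arange(0, 0.05, 1 / fs)

baseband = np.sin(2 * np.pi * 400 * t)        # the "voice" command: a 400 Hz tone
carrier = np.cos(2 * np.pi * 25_000 * t)      # inaudible ultrasonic carrier
transmitted = (1 + 0.8 * baseband) * carrier  # amplitude-modulated signal

# A perfectly linear microphone would hear nothing in the audible band.
# A quadratic non-linearity, however, demodulates the envelope downward.
received = transmitted + 0.1 * transmitted ** 2

# A moving-average filter stands in for the microphone's band limit.
kernel = np.ones(256) / 256
audible = np.convolve(received, kernel, mode="same")
audible -= audible.mean()                     # strip the DC offset

# The recovered signal correlates strongly with the original command.
corr = np.corrcoef(audible, baseband)[0, 1]
print(f"correlation with original command: {corr:.2f}")
```

Squaring the AM signal produces a term proportional to the envelope at baseband (plus components near twice the carrier frequency, which the filter removes), which is why the inaudible transmission re-emerges as an intelligible command inside the microphone.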
At the same time, the researchers stress that the problem is not confined to a single table: in theory, such attacks can work at distances of up to 9 meters (30 feet). Below you can see a real-world demonstration of this using a large aluminum plate.
The researchers tested SurfingAttack on a range of devices, including the Google Pixel, Apple iPhone, Samsung Galaxy S9, and Xiaomi Mi 8; all of them proved vulnerable. The attack was also found to work regardless of the table's surface material (metal, glass, or wood, for example) and the phone's configuration.
Only two devices withstood the attack: the Huawei Mate 9 and the Samsung Galaxy Note 10+, although the former became vulnerable when running LineageOS. The researchers attributed these failed attempts to "the structure and materials of the phone." The Amazon Echo and Google Home smart speakers also resisted SurfingAttack.
Let me remind you that this is not the first attack of its kind, nor the first time researchers have used ultrasound in this way. The most striking example is DolphinAttack, which likewise aimed to trick voice assistants with "silent" ultrasonic commands. One can also recall the similar studies BackDoor and LipRead, as well as the related Light Commands technique, which also sends covert commands to smart devices, but uses a laser rather than ultrasound.