Imagine you're sitting at a restaurant or a cafe and, for some reason, you take your phone out and leave it on the table. The next thing you know, your phone has been hacked. It sounds highly unlikely, but it's by no means impossible.
Joining the ever-growing list of ways to hack smartphones is a new technique that exploits your smartphone's ability to hear ultrasonic waves that are inaudible to human ears. The technique uses this ability to hack voice assistants such as Google Assistant and Siri.
Works on tables made of wood, glass and metal
Researchers at Washington University in St. Louis, Missouri used guided ultrasonic waves to trick virtual assistants into performing various actions, including placing phone calls, taking pictures, adjusting the volume and even retrieving passwords sent via text message. What's interesting is that the smartphones were simply placed on tables, with the ultrasonic waves being transmitted through the solid surface.
SurfingAttack, as the technique is called, worked on tables made of wood, glass and metal; it works on plastic tables too, although the results were somewhat unreliable.
It was developed by a team of researchers from Michigan State University, the Chinese Academy of Sciences, the University of Nebraska-Lincoln and Washington University in St. Louis.
How does a SurfingAttack work?
The technique requires some special hardware, but the key component is an off-the-shelf piezoelectric transducer that can be purchased for as little as $5. To the underside of a table, the researchers attached a piezoelectric transducer, which generates vibrations inaudible to human beings, along with a microphone to pick up the assistant's responses.
The setup can also be mounted on a thin piece of glass or metal hidden under a tablecloth: the transducer imparts vibrations that cause the material to emit inaudible sounds encoding words or instructions, which the smartphone's sensitive microphones can easily detect. These can then trigger the device's smart assistant if it has been set to respond to voice commands such as "Hey Google" or "Hey Siri". The team used a nearby waveform generator to produce the relevant signals, driven by a laptop running the attack software.
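The general principle behind inaudible voice-command attacks of this kind is to shift an audible command up into an ultrasonic band, typically via amplitude modulation; the phone's microphone hardware then effectively demodulates it back into the audible range. The sketch below is purely illustrative and is not the researchers' actual code; the 28 kHz carrier and 192 kHz sample rate are assumptions chosen for demonstration:

```python
import numpy as np

# Assumed parameters for illustration only: a sample rate high enough to
# represent ultrasound, and a carrier above the ~20 kHz limit of human hearing.
FS = 192_000          # sample rate in Hz
CARRIER_HZ = 28_000   # assumed ultrasonic carrier frequency in Hz

def modulate(baseband: np.ndarray, fs: int = FS, fc: float = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate an audible baseband signal onto an ultrasonic
    carrier: s(t) = (1 + m(t)) * cos(2*pi*fc*t)."""
    t = np.arange(len(baseband)) / fs
    peak = max(float(np.max(np.abs(baseband))), 1e-12)
    m = baseband / peak                      # normalise to [-1, 1]
    return (1.0 + m) * np.cos(2 * np.pi * fc * t)

# Stand-in for a recorded voice command: a 1 kHz tone lasting 0.1 s.
t = np.arange(int(0.1 * FS)) / FS
command = np.sin(2 * np.pi * 1_000 * t)
signal = modulate(command)
```

After modulation, the signal's energy sits around the 28 kHz carrier (with sidebands at carrier ± command frequencies), so a speaker or transducer reproducing it emits nothing a human can hear, while the command is still recoverable on the receiving side.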
Attack worked on different phones from five different manufacturers
The researchers tested a total of 17 smartphones and found that the attack worked on 15 of them, spanning five manufacturers. The phones included the Pixel 1, Pixel 2 and Pixel 3 from Google; Motorola's Moto G5 and Moto Z4; the Samsung Galaxy S7 and Galaxy S9; the Xiaomi Mi 5, Mi 8 and Mi 8 Lite; and Apple's iPhone 5, 5s, 6 Plus and iPhone X. They also noted that the attack worked on phones fitted with silicone cases. The fact that it worked across different brands and operating systems suggests it could well work on other phones too.
Phones with curved backs
The team also tested the Huawei Mate 9 and Samsung Galaxy Note 10 Plus and found that these phones weren't susceptible to the hack. It is thought that their "immunity" to SurfingAttack comes down to their curved rear panels, which reduce the surface area in contact with the table. Some materials conduct ultrasonic signals better than others: using an aluminium plate, the researchers pulled off a successful SurfingAttack from a distance of 30 feet, which also allowed them to keep the rest of the required equipment hidden from view.
Why should you worry?
Now you may be wondering: why would a hacker want to hack into my phone just to ask the assistant about the weather? Unfortunately, the smart assistants on our phones have become so deeply integrated into the OS that hackers can do a lot of damage through them. For example, they can place long-distance calls to quickly rack up high bills, or intercept confidential emails and text messages, which could give them access to verification codes linked to your bank account if your phone number is used as part of a two-factor authentication method.
If the technique gets into the wrong hands, a hacker could potentially siphon off personal information stored on a phone, hijack connected gadgets and more.
SurfingAttack also relies on a hidden microphone to listen for responses from the target device's voice assistant, which the hacker can hear through an earphone. The hacker can also issue commands to lower the smartphone's volume so that the responses are inaudible to anyone nearby, meaning the owner may never notice their device holding these "one-way" conversations. It's therefore quite possible for the attack to go unnoticed for several minutes, until the damage is already done.
Although the complexity involved in SurfingAttack makes it a highly unlikely candidate for showing up in the wild, we should never take chances. The fact that a digital assistant can be controlled without your knowledge or any actual user input is a solid reminder to take stock of your phone's settings.
Protect yourself from SurfingAttack
As far as safeguarding your smartphone from SurfingAttack is concerned, the researchers fortunately found several ways to thwart the attack. Perhaps the easiest is to simply disable your smart assistant's always-on listening feature so that it has to be invoked manually: if your phone isn't waiting to hear a wake phrase, it simply cannot be hijacked this way.
They also recommend turning off lock screen personal results on Android, which will require you to unlock the device before Google Assistant can look up results on your behalf or access other personal info.
The researchers also found that thicker tablecloths were good at muffling ultrasonic signals, to the point where the voice commands were no longer understood by the device. This suggests that a beefier smartphone case, designed to protect your phone against drops and impacts, could offer similar protection.
You could also consider removing personal results from the list of items that your assistant can access when it is invoked and review your Google activity history log to see what commands your Google Assistant has carried out.