Lasers can silently issue 'voice commands' to your smart speakers



In one demonstration, the researchers beamed a modulated laser through a window at a smart speaker's microphone port; the modulation was equivalent to the voice command "OK Google, open the garage door".

The laser study was conducted by researchers at the University of Electro-Communications in Tokyo and the University of Michigan, who detail their work in a new paper, "Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems".

"If you have a laser that can shine through windows and across long distances - without even alerting anyone in the house that you're hitting the smart speaker - there's a big threat in being able to do things a smart speaker can do without permission of the owner", said Benjamin Cyr, a graduate student at the University of MI and a paper coauthor, according to the report. This would trick the device's voice assistant into responding to the light that hit the microphone's membrane as if it were sound.

We already knew that Google and Amazon listen to their users through their voice-activated Echo and Home smart speakers.

The lasers essentially trick the microphones into producing electrical signals as if they were picking up someone's voice, the researchers noted. Researchers from the University of Electro-Communications and the University of Michigan have demonstrated one approach to hacking smart speakers using lasers.
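As a rough illustration of the amplitude-modulation idea behind Light Commands, the Python sketch below encodes an audio waveform onto a laser's optical power. The bias power, modulation depth, and test tone here are illustrative assumptions rather than values from the paper, and the code only simulates the signal; it does not drive any hardware.

```python
import numpy as np

def laser_intensity_am(audio, bias_mw=50.0, depth=0.8):
    """Amplitude-modulate a laser's optical power with an audio waveform.

    audio   : 1-D array of samples, assumed normalized to [-1, 1]
    bias_mw : illustrative DC bias power in milliwatts (an assumption)
    depth   : illustrative modulation depth between 0 and 1 (an assumption)

    Returns the instantaneous optical power in milliwatts. A microphone
    hit by this fluctuating light produces an electrical signal that the
    voice assistant interprets as the encoded audio.
    """
    audio = np.clip(np.asarray(audio, dtype=float), -1.0, 1.0)
    power = bias_mw * (1.0 + depth * audio)
    return np.clip(power, 0.0, None)  # optical power cannot be negative

# Example: a 1 kHz test tone at 16 kHz standing in for a recorded voice command.
fs = 16_000
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 1_000 * t)
modulated_power = laser_intensity_am(tone)
print(modulated_power[:5])
```

In this sketch the audio rides on top of a constant bias, so louder samples simply make the beam momentarily brighter; that intensity variation is what the microphone ends up converting into a voltage.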

"Once an attacker gains control over a voice assistant a number of other systems could be open to their manipulation", a breakdown of the study on the University of Michigan's website says.

The report lists several devices that are susceptible to Light Commands and points out that "While we do not claim that our list of tested devices is exhaustive, we do argue that it does provide some intuition about the vulnerability of popular voice recognition systems to Light Commands".

So how did the researchers manage to control these devices with just a laser?

The vulnerability can be used to reach other systems connected through a smart speaker, such as smart home devices and connected garage doors, as well as to make online purchases, remotely start some vehicles, and more.

Business Insider reports that a team of researchers from Tokyo's University of Electro-Communications and the University of Michigan claim to have discovered a way to "hijack" voice-enabled devices by shining a laser at their microphones.

Smart speakers like Google Home (Nest), Apple HomePod, and Amazon Echo are constantly listening using local audio processing, but they only "wake up" when someone says the trigger phrase.
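As a rough sketch of that gated, always-listening behavior, the Python snippet below buffers microphone audio locally and only hands it off for full speech recognition once a wake-word detector fires. The detect_wake_word and send_to_cloud callables are hypothetical placeholders for illustration, not any vendor's actual API.

```python
from collections import deque

BUFFER_FRAMES = 50  # keep a short rolling window of recent audio on-device

def run_wake_word_loop(mic_frames, detect_wake_word, send_to_cloud):
    """Gate cloud speech recognition behind local wake-word detection.

    mic_frames       : iterable of audio frames from the microphone
    detect_wake_word : callable(list_of_frames) -> bool, a hypothetical local model
    send_to_cloud    : callable(list_of_frames), invoked only after the wake word fires
    """
    recent = deque(maxlen=BUFFER_FRAMES)  # audio stays local until the trigger phrase
    for frame in mic_frames:
        recent.append(frame)
        if detect_wake_word(list(recent)):
            # Only now is the buffered audio handed off for full recognition.
            send_to_cloud(list(recent))
            recent.clear()

# Tiny demo with dummy hooks: frames are strings and the "wake word" is the frame "hey".
frames = ["noise", "noise", "hey", "turn on the lights"]
run_wake_word_loop(
    frames,
    detect_wake_word=lambda buf: buf[-1] == "hey",
    send_to_cloud=lambda buf: print("forwarding:", buf),
)
```

The point of the sketch is that nothing leaves the device until the trigger fires, which is exactly why an injected "voice" that includes the wake word is enough to start issuing commands.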

Spokespeople for Google and Amazon said the companies are reviewing the research and its implications for the security of their products, but said the risk to consumers seems limited. Amazon did not respond to a request for comment at the time of publication. The attack allows the researchers to issue nearly any command they like.

The technique requires the laser to actually hit the target device's microphone port, which gets significantly harder as the distance grows. Infrared lasers also work in some cases, meaning light invisible to the human eye could potentially be used in stealthy attacks. Voice-command devices generally give audible responses, but an attacker could still turn down the device's volume to continue operating it undetected.
