The laser study was conducted by researchers at the University of Electro-Communications in Tokyo and the University of Michigan, who detail their work in a new paper, "Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems".
"If you have a laser that can shine through windows and across long distances - without even alerting anyone in the house that you're hitting the smart speaker - there's a big threat in being able to do things a smart speaker can do without permission of the owner", said Benjamin Cyr, a graduate student at the University of Michigan and a coauthor of the paper, according to the report. The attack tricks the device's voice assistant into responding to light hitting the microphone's membrane as if it were sound.
The lasers essentially trick the microphones into producing electrical signals as if they were picking up a person's voice, the researchers noted.
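The core idea, as the researchers describe it, is that modulating a laser's intensity with an audio waveform makes the microphone's output track that waveform as if it were sound. A minimal sketch of that amplitude modulation is below; the bias current, modulation depth, and sample rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

SAMPLE_RATE = 16_000  # Hz, a typical rate for voice-command audio
I_DC = 200.0          # mA, DC bias keeping the laser diode lasing (assumed value)
I_PP = 150.0          # mA, peak-to-peak modulation depth (assumed value)

def audio_to_drive_current(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform to laser-diode drive-current samples.

    The light intensity follows the drive current, so the microphone's
    membrane "hears" the encoded audio when the beam hits it.
    """
    peak = np.max(np.abs(audio))
    norm = audio / peak if peak > 0 else audio  # normalize to [-1, 1]
    return I_DC + (I_PP / 2.0) * norm

# Example: a 440 Hz test tone standing in for a recorded voice command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 440 * t)
current = audio_to_drive_current(tone)
```

In practice the current samples would feed a laser driver circuit; the sketch only shows the encoding step, not the optics or driver hardware.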
"Once an attacker gains control over a voice assistant, a number of other systems could be open to their manipulation", a breakdown of the study on the University of Michigan's website says.
The report lists several devices that are susceptible to Light Commands and points out that "While we do not claim that our list of tested devices is exhaustive, we do argue that it does provide some intuition about the vulnerability of popular voice recognition systems to Light Commands".
So how did the researchers manage to control these devices with just a laser?
The vulnerability can be used to break into other systems connected through a smart speaker, such as smart home devices, connected garage doors, online shopping accounts, remote starting of some vehicles, and more.
Business Insider reports that a team of researchers from Tokyo's University of Electro-Communications and the University of Michigan claim to have discovered a way to "hijack" voice-enabled devices by shining a laser at the devices' microphones.
A Google spokesperson said the company is reviewing the research and its implications for the security of its products, but that the risk to consumers seems limited. Amazon did not respond to a request for comment at the time of publication. The technique allows the researchers to issue nearly any command they like.
The technique requires the laser to actually hit the target device's microphone port, which becomes significantly harder as the distance grows. Infrared lasers also work in some cases, meaning an attacker could potentially use light invisible to the human eye for stealthier attacks. Voice-command devices generally give audible responses, but an attacker could first turn down the device's volume to continue operating it undetected.