Voice assistants such as Amazon’s Alexa, Apple’s Siri and Google Assistant can be hacked by shining a laser on the devices’ microphones, according to an international team of researchers.
Dubbed “Light Commands,” the hack “allows attackers to remotely inject inaudible and invisible commands into voice assistants,” according to a statement from experts at the University of Electro-Communications in Tokyo and the University of Michigan.
By targeting the MEMS (micro-electro-mechanical systems) microphones with lasers, the researchers say they were able to make the microphones respond to light as if it were sound. “Exploiting this effect, we can inject sound into microphones by simply modulating the amplitude of laser light,” they wrote in the research paper.
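The modulation step the researchers describe can be sketched in a few lines. The following Python snippet is an illustrative model only, not the team's actual attack code: it maps a command's audio waveform onto a positive laser-intensity signal by riding the audio on a DC bias, which is the basic idea behind amplitude-modulating the light. The parameter names and values here are assumptions chosen for clarity.

```python
import numpy as np

SAMPLE_RATE = 44_100  # audio sample rate in Hz (assumed for illustration)

def amplitude_modulate(audio, bias=0.5, depth=0.4):
    """Map an audio waveform in [-1, 1] onto a laser-intensity signal.

    Optical power cannot go negative, so the audio is normalized and
    placed on a DC bias. A MEMS microphone struck by this modulated
    light produces an electrical signal resembling the original audio.
    bias and depth are hypothetical values, not from the paper.
    """
    audio = np.asarray(audio, dtype=float)
    peak = np.max(np.abs(audio))
    if peak == 0:
        peak = 1.0
    return bias + depth * (audio / peak)

# Example: a 1 kHz tone standing in for a spoken command
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 1000 * t)
drive = amplitude_modulate(tone)
```

In this toy model, `drive` stays within [0.1, 0.9], so a laser driven by it never switches off entirely; the audible command is carried purely in the intensity variation.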
“We show that user authentication on these devices is often lacking or non-existent, allowing the attacker to use light-injected voice commands to unlock the target’s smartlock-protected front doors, open garage doors, shop on e-commerce websites at the target’s expense, or even locate, unlock and start various vehicles (e.g., Tesla and Ford) that are connected to the target’s Google account,” they wrote.
The researchers have shared their findings with Amazon, Apple, Google, Tesla and Ford. “We subsequently maintained contact with the security teams of these vendors, as well as with ICS-CERT and the FDA,” they said, noting that the findings were made public on “the mutually-agreed date” of Nov. 4.
The Industrial Control Systems Cyber Emergency Response Team (ICS-CERT) aims to reduce the risk to America’s critical infrastructure by forging strong partnerships between government and industry.