How to hack Alexa: guidance from researchers

The researchers successfully demonstrated how light can be used to inject malicious commands into several voice-controlled devices over long distances, and even through glass.

As if concerns about these devices listening in weren't bad enough, Siri and Alexa can also be hijacked by lasers, researchers find. Voice-activated digital assistants can be remotely hijacked by lasers from as far as 350 feet away and made to order products, start cars, and otherwise drive a smart-home owner to distraction, the researchers discovered.

This ability for criminals to "talk" to digital voice assistants by shining a laser at the device's microphone will be a cause for concern for homeowners, depending on how automated their property is. Once an attacker gains control of a voice assistant, other connected systems, including smart home switches and smart garage doors, can be broken into as well.

The researchers don't yet know exactly how the microphones interpret light as sound, but the effect is real, opening up a new way in which smart home devices and virtual assistants could be hacked and exploited.

By calibrating the lasers to match the frequency of a human voice, the researchers were able to beam commands to a selection of smart speakers as well as an iPhone and a pair of Android devices.


A video of a Google Home speaker being tricked into opening a garage door can be found here. Smart speakers typically don't ship with any user-authentication features turned on by default; Apple's devices are among the few exceptions, and required the researchers to find a way around that protection. Amazon admitted this year, in a letter to a U.S. senator, that it keeps Alexa users' voice recordings indefinitely, and it is being sued over allegations that Alexa-powered smart speakers are recording children. Amazon dominates this market: Canalys reports the company shipped about a quarter of all smart speakers, an estimated 6.6 million units, between April and June.

Just five milliwatts of laser power, the equivalent of a laser pointer, was enough to gain full control over many popular Alexa and Google smart home devices, while about 60 milliwatts was needed for phones and tablets. The researchers said their maximum test distance was limited by the longest space available to them (a hallway). They explain that microphones work by converting sound into electrical signals, and the modulated light appears to produce those same signals directly.
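The attack described above rests on amplitude modulation: the laser's intensity is driven to follow the waveform of a spoken command, so the microphone responds as if sound had arrived. A minimal sketch of that mapping in Python (the function name, the 5 mW bias, and the modulation depth are illustrative assumptions, not details from the paper):

```python
import math

def audio_to_laser_intensity(samples, bias_mw=5.0, depth=0.9):
    """Map an audio waveform onto a laser diode drive intensity (mW).

    The intensity tracks the audio signal, riding on a DC bias so it
    never goes negative (a laser can't emit "negative" light).
    `bias_mw` and `depth` are illustrative values, not ones from the
    researchers' paper.
    """
    peak = max((abs(s) for s in samples), default=0.0) or 1.0
    # Normalize to [-1, 1], then offset and scale into a power level.
    return [bias_mw * (1.0 + depth * s / peak) for s in samples]

# Example: a 440 Hz "voice" tone sampled at 16 kHz for 10 ms.
sr = 16_000
tone = [math.sin(2 * math.pi * 440 * n / sr) for n in range(sr // 100)]
power = audio_to_laser_intensity(tone)
assert min(power) >= 0.0  # laser intensity can never go negative
```

In a real attack rig this drive signal would feed a laser diode driver; the sketch only shows why a DC bias is needed and how the audio shapes the light's intensity.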

The researchers noted that they haven't seen this vulnerability exploited in the wild. The technique can make smart speakers, smartphones, and tablets perform numerous tasks from hundreds of feet away. An earlier ultrasonic attack raised similar concerns; it's not clear whether that vulnerability was ever fixed, since filtering out those frequencies may not be possible with current microphone designs, but unlike the laser hack it worked only in close proximity to the device. Takeshi Sugawara, a visiting scholar at the University of Michigan and the paper's lead author, said one way to defend against the laser attack would be to add an obstacle that blocks a straight line of sight to the microphone's diaphragm.

Genkin said he contacted Google, Apple, Amazon, and other companies to address the security issue. Amazon did not respond to a request for comment at the time of publication.
