Technology products are constantly probed for security flaws by researchers who hunt for holes in devices in order to get them fixed. Now a group of researchers has discovered a way to hack into several products using nothing more than a laser.
The new technique uses a laser to remotely hijack devices with MEMS (micro-electromechanical systems) microphones, such as the iPhone, HomePod, Google Home, and Amazon Echo.
Ars Technica has published a report on the attack, dubbed ‘Light Commands’ and discovered by researchers at the University of Michigan and the University of Electro-Communications. It allows anyone to issue Siri, Google Assistant, or Alexa commands from a distance, with the only condition being a clear line of sight to the device.
As the report explains, a low-power laser lets an attacker execute commands of their choice on voice-activated systems such as the HomePod. The attack is especially worrying because the attacker doesn't need to be anywhere near the device: it works from up to 110 meters away and can even pass through a window.
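At a high level, the attack works by amplitude-modulating the laser's intensity with an audio waveform: the MEMS microphone's diaphragm responds to the fluctuating light as if it were sound. The sketch below is only an illustration of that modulation idea; the function name, bias, and depth values are our own assumptions, not details from the researchers' paper.

```python
import math

def laser_drive_signal(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate a laser's drive level with an audio waveform.

    `audio` is a list of samples in [-1, 1]. The result is a drive level
    in [0, 1]: a constant bias (the laser stays on) plus the audio signal
    scaled by a modulation depth. A MEMS microphone hit by this light
    picks up the audio as if it were spoken aloud. The parameter names
    and values here are illustrative assumptions, not from the paper.
    """
    # Clamp so the drive level never goes negative (light intensity
    # cannot be negative) or above full power.
    return [min(1.0, max(0.0, bias + depth * s)) for s in audio]

# Example: modulate the laser with a 1 kHz test tone sampled at 16 kHz.
sample_rate = 16_000
tone = [math.sin(2 * math.pi * 1000 * n / sample_rate) for n in range(160)]
drive = laser_drive_signal(tone)
```

In a real attack the `audio` samples would be a recorded voice command rather than a test tone, and the drive signal would feed a laser diode driver.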
The underlying security problem is that voice assistants usually don't ask us to authenticate with a password or PIN. And when they do, an attacker could bypass that protection by brute force. So if your HomePod sits near a window, someone with a laser could execute commands and access your personal information.
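The brute-force point is easy to see with a little arithmetic: a spoken 4-digit PIN has only 10^4 = 10,000 possibilities, so an attacker who can inject commands can simply try them all. A minimal sketch of enumerating that search space (the helper name is hypothetical, standing in for "speak this PIN to the assistant"):

```python
from itertools import product

def enumerate_pins(digits=4):
    """Yield every possible PIN of the given length as a string.

    For 4 digits that is only 10,000 candidates -- small enough that an
    attacker injecting voice commands could try each one in turn.
    """
    for combo in product("0123456789", repeat=digits):
        yield "".join(combo)

pins = list(enumerate_pins())
```

The list runs from "0000" through "9999"; without rate limiting or lockout on the assistant's side, nothing stops an attacker from walking through it.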
The attack obviously has a major limitation: it requires a laser and a direct line of sight to the speaker. But the real concern is everything the smart speaker controls. With that access it is possible, for example, to unlock a smart door lock, which is a serious problem.
To address the problem, the researchers have created a website, which we link here, explaining in detail how the attack works. They are also collaborating with companies such as Apple, Google, and Amazon to come up with some kind of fix.
Let us know in the comments what you think about this vulnerability in devices with MEMS microphones.