The team has published an article describing the light-based attack after seven months of experimentation. They were able to hijack smart speakers at a distance of 230 to 350 feet by focusing lasers with a telephoto lens. In one demonstration, the Google Home that opened a garage door was in a room in another building. The laser modulation they beamed through the window at its microphone port was equivalent to the voice command "OK Google, open the garage door."
They explained that the microphones in these devices contain a small plate called the diaphragm, which moves when struck by sound waves. A laser can emulate this motion, which the device then converts into electrical signals it can understand. They said taking over the Google Home to open the garage door was easy, and that with the same method they could also make online purchases, open doors protected by smart locks, and even remotely unlock cars connected to voice-assistant-enabled devices.
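The core idea described above can be sketched in a few lines: because the diaphragm responds to the laser's light intensity much as it responds to sound pressure, modulating the intensity with an audio waveform effectively injects a "voice" into the microphone. The function name, bias, and modulation depth below are illustrative assumptions, not values from the researchers' work.

```python
import numpy as np

SAMPLE_RATE = 44_100  # audio sample rate in Hz (assumed for illustration)

def intensity_modulate(audio: np.ndarray, bias: float = 0.5,
                       depth: float = 0.4) -> np.ndarray:
    """Map an audio waveform in [-1, 1] to a non-negative laser
    intensity drive signal: bias + depth * audio.

    The bias keeps the laser on; the audio rides on top of it,
    so the diaphragm vibrates as if hit by the spoken command."""
    audio = np.clip(audio, -1.0, 1.0)
    intensity = bias + depth * audio
    # A physical laser cannot emit negative power, so clamp to [0, 1].
    return np.clip(intensity, 0.0, 1.0)

# Stand-in for a recorded voice command: one second of a 440 Hz tone.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = np.sin(2 * np.pi * 440 * t)

drive = intensity_modulate(command)
```

In practice the real attack requires precise aiming and line of sight to the microphone port, but the signal-processing side is no more exotic than this kind of amplitude modulation.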
The researchers have already briefed Tesla, Ford, Amazon, Apple, and Google about the issue.
This is far from the first vulnerability security researchers have discovered in digital assistants. Researchers at Zhejiang University in China concluded that Siri, Alexa, and other voice assistants can be manipulated with commands sent at ultrasound frequencies. Meanwhile, a group from the University of California, Berkeley, found that they could take control of smart speakers by embedding commands inaudible to the human ear directly into recordings of music or spoken text.
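The ultrasound technique mentioned above relies on amplitude-modulating an audible command onto a carrier above the range of human hearing; nonlinearities in the microphone then demodulate it back into the audible band. A minimal sketch of that modulation step follows, with the carrier frequency, sample rate, and function name chosen as illustrative assumptions.

```python
import numpy as np

SAMPLE_RATE = 192_000   # high sample rate needed to represent ultrasound
CARRIER_HZ = 25_000     # above the ~20 kHz upper limit of human hearing

def ultrasonic_am(audio: np.ndarray, depth: float = 0.8) -> np.ndarray:
    """Classic amplitude modulation: (1 + depth * audio) * carrier.

    A human ear only perceives the 25 kHz carrier (i.e. nothing),
    while the microphone's nonlinear response recovers the envelope."""
    t = np.arange(audio.size) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    modulated = (1.0 + depth * np.clip(audio, -1.0, 1.0)) * carrier
    # Normalize so the signal fits a [-1, 1] playback range.
    return modulated / np.max(np.abs(modulated))

# Stand-in command: 0.1 s of a 1 kHz tone sampled at the ultrasonic rate.
t = np.arange(SAMPLE_RATE // 10) / SAMPLE_RATE
command = np.sin(2 * np.pi * 1_000 * t)
signal = ultrasonic_am(command)
```

The Berkeley result is different in kind: rather than hiding the carrier above hearing range, it hides adversarial perturbations inside ordinary audio, so no single code snippet captures it as directly.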