They are always listening. They are on the internet. But what happens when digital assistants like Alexa go rogue? Could they share our private conversations without our consent? Privacy advocates have long warned that this could happen, and now it has.
A woman in Portland, Oregon, told KIRO7, a television news station in Washington, that her Amazon Echo device had recorded a private conversation and then sent the recording to one of her husband's employees in Seattle.
Skeptics were quick to say "we told you so" as the news spread across the connected world.
Now, Amazon says it knows what happened: While the woman, identified only as Danielle, was chatting with her husband, the device's virtual assistant, Alexa, mistakenly heard a series of requests and commands to record the conversation and send it as a voice message to one of the husband's employees.
"Echo woke up due to a word in the background conversation that sounded like 'Alexa,'" Amazon said in a statement. "The subsequent conversation was then heard as a request to send a message. At that point, Alexa said out loud, 'To whom?' The background conversation was then interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' and interpreted the background conversation as 'right.' As unlikely as this string of events is, we are evaluating options to make this case even less likely."
[Earlier: Amazon clarifies why Alexa laughed at customers.]
In a follow-up interview with KIRO7, Danielle said that the Echo that shared her conversation was right next to her at the time, with its volume set at seven out of 10. The device never asked her permission to send the audio, she said.
The family had several Echo devices in their home and used them to control the heat, the lights, and the security system. But two weeks ago, Danielle's husband received a phone call from the employee in Seattle, who said he had heard their conversation.
"At first, my husband said, 'No, you didn't,'" Danielle told KIRO7. "And he was like, 'You were sitting there talking about hardwood floors.' And we said, 'Oh gosh, you really did!'"
The family unplugged the devices and contacted Amazon, which led to an investigation. Now Danielle is asking for a refund.
"I'll never plug that device in again," she told KIRO7. "I can't trust it."
If you have an Echo and are concerned about what it might be recording, an Amazon help page explains that you can review, listen to, and delete audio and other interactions in the settings menu.
Amazon's main home-assistant devices, the Echo, Echo Plus, and Echo Dot, each feature seven microphones and noise-canceling technology. Amazon and Google are the leading sellers of such devices.
This is not the first report of misheard Echo commands producing unusual results. Amazon issued a similar statement in March after several users reported that Alexa was laughing at random times.
The assistant, according to the company, had "in rare circumstances" mistakenly heard the command "Alexa, laugh." Amazon changed the command to "Alexa, can you laugh?" and had the device verbally confirm such requests.
This month, researchers from the University of California, Berkeley, said in a published paper that they were able to hide commands in recordings of music or spoken text that went unnoticed by people but were understood by digital assistants like Apple's Siri, Google Assistant, and Amazon's Alexa.