
Amazon could have a serious Alexa problem on its hands – BGR



Earlier this week, news broke that Amazon's Alexa assistant had recorded a private conversation between two people and then sent that recording to a third party. Alexa is, of course, supposed to listen to everything you say, but only act when you say the hotword that summons the assistant. Only then should Alexa follow your commands and send messages to others, provided that's a skill you have enabled on your Echo smart speakers.

Amazon explained what went wrong with this particular Echo speaker, and it all sounds like a series of unfortunate events. It also sounds like Amazon could have a real Alexa problem to fix.

Amazon told Recode what caused this privacy-violating incident. Here's what happened; we've broken Amazon's statement down into the individual steps Alexa went through to send the message:

Echo woke up due to a word in the background conversation that sounded like "Alexa."

Then the subsequent conversation was heard as a "send message" request. At that point, Alexa said out loud, "To whom?" The background conversation was then interpreted as a name in the customer's contact list.

Then Alexa asked out loud, "[contact name], right?" Alexa then interpreted the background conversation as "right."

As unlikely as this chain of events is, we are evaluating options to make this case even less likely.

All of this sounds very unlikely, but it also explains exactly what happened. As a reminder, the woman was speaking to her husband, and a partial recording of their conversation was then sent to one of his employees, who lives in another state.

It's entirely possible that one of the two people in the conversation said a word that sounded like "Alexa" and triggered the chain of events described above. They may also have mentioned a name that sounded like the employee's name, and used words that could be interpreted as confirmation to send the message.
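To make that chain concrete, here is a minimal, purely illustrative Python sketch of a loose confirmation flow of this kind. Everything in it, the function names, the contact list, and the matching rules, is an assumption made up for the example; Amazon has not published how Alexa actually implements these steps. The point is only that when every step accepts a fuzzy match, background chatter can clear each hurdle in turn:

```python
# Hypothetical sketch only: none of these names or rules come from Amazon.
CONTACTS = ["alice", "bob", "carol"]            # made-up contact list
CONFIRMATION_WORDS = {"yes", "right", "ok"}     # words treated as "go ahead"


def sounds_like(utterance: str, target: str) -> bool:
    """Stand-in for fuzzy speech matching; here just a substring check."""
    return target in utterance.lower()


def handle_background_speech(snippets: list) -> str:
    """Walk the three steps from Amazon's statement with loose matching."""
    # Step 1: a word in background conversation is mistaken for the wake word.
    if not sounds_like(snippets[0], "alexa"):
        return "stayed asleep"

    # Step 2: later chatter is heard as a "send message" request and
    # matched against the contact list to pick a recipient.
    recipient = next((c for c in CONTACTS if sounds_like(snippets[1], c)), None)
    if recipient is None:
        return "no recipient understood"

    # Step 3: "[contact name], right?" -- any confirmation-sounding word
    # in the background is enough to send the recording.
    words = (w.strip(".,?!") for w in snippets[2].lower().split())
    if any(w in CONFIRMATION_WORDS for w in words):
        return "message sent to " + recipient

    return "message cancelled"


# Three unrelated bits of conversation that happen to line up:
print(handle_background_speech([
    "we should ask alexa about that later",
    "maybe bob can take a look",
    "yeah, right, exactly",
]))  # -> "message sent to bob"
```

In a design like this, three coincidental matches in a row are all it takes, which is exactly the "series of unfortunate events" Amazon describes.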

But no matter how you look at it, that's a serious problem. Apparently, Alexa can misinterpret its own hotword, and that's definitely not something you want from the assistant.

Also, notice that Amazon does not say how Alexa started recording the conversation; presumably, it somehow interpreted some of those words as a command to record the chat. Amazon's explanation jumps from the Alexa invocation directly to the message-sending part.

Another problem to keep in mind is that the couple never heard Alexa's responses. Normally, when you hear Alexa talking without having summoned her, you stop whatever you're doing and see what it's all about. Is it possible that Alexa was too far from the conversation to properly make out the words, yet still close enough to be heard? The original report says Echo speakers were installed in every room of the house, and the woman said the device never indicated it was preparing to send the recording.

On the other hand, this is not a common occurrence, or the internet would be filled with similar stories. But if you want to make sure it never happens to you, just avoid sending any messages via Alexa. I know, that's extremely annoying, right? You know what's even more annoying? You have to contact Amazon customer support to disable the feature. For instructions, see BuzzFeed.

