Alexa users who do not want their recordings reviewed by third-party contractors finally have an option to opt out, thanks to a new Amazon policy implemented amid mounting criticism of the company and of its voice assistant competitors, Apple and Google.
The policy came into effect on Friday, Bloomberg reported, adding a new disclaimer about human review and the ability to toggle the permission in the Alexa app's settings menu.
Unfortunately, Amazon has never made it particularly easy to disable data collection on its devices, and this new policy does not buck that trend. According to Bloomberg, users need to open the settings menu, navigate to "Alexa Privacy," and then tap "Manage how your data improves Alexa" to see the following text: "With this setting on, your voice recordings may be used to develop new features and manually reviewed to help improve our services. Only an extremely small fraction of voice recordings are manually reviewed."
Previously, customers could only deny permission for their recordings to be used in developing new device features. An Amazon spokesperson told Gizmodo that selecting this option also opted them out of manual review. However, the fact that strangers might be analyzing your Alexa requests was never explicitly mentioned in either this setting or the voice assistant's terms and conditions.
We have known that Amazon contractors listen to Alexa recordings since at least April. The company had stayed silent on policy adjustments until now, even as Apple and Google suspended the practice after similar reports surfaced about their own voice assistants. In Google's case, a contractor leaked more than a thousand assistant recordings to a Belgian news site last month, prompting the company to pause the practice at the urging of a European data protection authority, TechCrunch reported.
That's not to say human review of voice assistant recordings is going the way of the dodo. Companies like Amazon, Apple, and Google are making progress on artificially intelligent software, but as Gizmodo has previously reported, the technology still lacks the sophistication needed to shed its human-supervision training wheels entirely. However, the controversies of the last few months do seem to be pushing these businesses toward more transparency about what's going on, so users can at least decide whether their Alexa requests travel beyond the device.
Gizmodo reached out to Amazon with questions about the new policy, and we will update this post if the company responds.
Update 1:15 p.m.: An Amazon spokesperson told Gizmodo in an email that the company added "new language" to its voice assistant's FAQ page. A quick comparison with the page's appearance on the Wayback Machine in July suggests the description is indeed new. The FAQ page now answers in far more detail the question of how a user's Alexa voice recordings are used for training:
"This training relies in part on supervised machine learning, an industry-standard practice where humans review an extremely small sample of requests to help Alexa understand the correct interpretation of a request and provide the appropriate response in the future."
Alexa's previous FAQ page made no mention of a human review process. "Our supervised learning process includes multiple safeguards to protect customer privacy," the new page states. It then explains how to toggle the permission for contractors to review your recordings.
The same spokesperson also repeated the following statement given for our previous coverage of Alexa voice recordings and contractors:
"For Alexa, we already offer customers the ability to opt out of having their voice recordings used to develop new Alexa features. The voice recordings of customers who use this opt-out are also excluded from our supervised learning workflows, which involve manually reviewing an extremely small sample of Alexa requests. We are also updating the information we provide to customers to make our practices clearer."