Apple's practice of using human reviewers to improve the quality of Siri came under scrutiny after The Guardian reported last month that contractors could hear users' private conversations. Apple (AAPL) initially responded by temporarily suspending the practice earlier this month while the company reviewed it. Following that review, Apple said users will now have to opt in to have their recordings listened to by human reviewers, rather than that being the default. And only Apple employees, not contract workers, will be allowed to listen to audio samples of Siri interactions.
The company also announced that it will no longer retain audio recordings of users' interactions with Siri by default.
"We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process," the company said. "As a result of our review, we realize we haven't been fully living up to our high ideals, and for that we apologize."
Apple is not the only company forced to rethink its approach to reviewing user recordings over privacy concerns. Google (GOOGL) temporarily suspended human review of its recordings, and Amazon (AMZN) recently updated its Alexa settings to make it easier for users to opt out of having their recordings reviewed. Facebook (FB) also paused human review of some users' audio clips. In each case, real people were checking recordings from voice assistants, often without users' knowledge. The coverage has made clear that many consumer tech products are not powered solely by faceless algorithms and artificial intelligence; they require a human touch, with people listening to snippets of conversations so that voice-controlled technology can improve. However, experts also said that technology companies should do more to clarify what happens to the recordings on these systems and what privacy risks exist.
For Apple, the stakes are particularly high. The company has repeatedly sought to position itself as a privacy-focused business, drawing a clear contrast with competitors such as Facebook and Google. Apple CEO Tim Cook commonly describes privacy as a "basic human right."