Amazon is joining Google and Apple in taking steps to allow users to opt out of human review of its voice assistant recordings, the company announced on Friday. These changes come after many consumers railed against a practice that they saw as encroaching on their privacy, especially after a Guardian report revealed that contractors working for Apple heard snippets of drug deals and couples having sex while reviewing Siri recordings for accuracy.
News first broke in April that employees at Amazon were transcribing users’ Alexa recordings. Last month, a whistleblower told the Guardian that Apple was also listening in on smart speaker snippets, and a Google worker leaked 1,000 recordings to expose the company’s recording and reviewing practices. Human review is supposed to improve the quality of smart speakers’ responses to queries and prevent unintentional activation of the devices, according to the companies. But none of them were up front in letting users know that real people would be privy to their chats with Google Assistant, Alexa, or Siri—or to the unprompted exchanges that the assistants picked up. For example, Apple’s policy only noted that, “certain information such as your name, contacts, music you listen to, and searches is sent to Apple servers using encrypted protocols.” Most users likely didn’t consider that “certain information” could also include intimate moments or illegal transactions.
Google suspended the practice in Europe after regulators slapped the tech giant with a three-month ban on listening to recordings, and Apple has halted voice review worldwide. It’s not clear whether these moves are permanent. Meanwhile, Amazon hasn’t officially discontinued transcription but is offering a clear way for users to opt out. The official statement from Apple explains that the company is conducting a “thorough review,” while Amazon announced it would be “updating information we provide to customers to make our practices more clear.”
Clarity has been hard to come by as Apple, Google, and Amazon have scrambled to explain themselves and adjust their policies. If you’re looking to disable smart speaker recording or opt out of human review, here’s a straightforward guide.
Amazon provided a specific set of directions to opt out of human review. To do so, first open the Amazon Alexa app and tap Settings. From there, tapping through “Alexa Privacy” and “Manage How Your Data Improves Alexa” will bring you to a screen with an updated explanation of the policy. At the bottom, uncheck the box labeled “Help Improve Amazon Services and Develop New Features” to ensure that no recordings can be poached for human review.
Turning this off does not mean that Alexa recordings won’t still be uploaded to Amazon’s servers. To delete recordings, navigate through Settings > Alexa Privacy > Review Voice History in the app. From there, you can delete specific recordings or everything from a chosen date range.
Contractors aren’t going to be listening to Siri recordings anymore, but it’s not clear if Apple will delete those that are already on its servers. Currently, the company stores clips for six months, after which time it removes user IDs from copies that could linger on the server for up to two years. The best you can do now is disable Siri. Here’s how:
In iOS, go to Settings > Siri & Search. Then turn off “Listen for ‘Hey Siri’” and “Press Side Button for Siri.” You’ll see a prompt asking whether you want to disable Siri; select Turn Off Siri. The last step is to go to Settings > General > Keyboard and turn off “Enable Dictation.”
On a Mac desktop or laptop, go to System Preferences > Siri and turn off “Enable Ask Siri.” After that, go to System Preferences > Keyboard > Dictation and turn Dictation off.
To thwart Google Assistant, open the Google Home app, then go to Account > More settings > Your data in the assistant > Voice & Audio Activity. You can deactivate voice and audio recording using the toggle on that screen.