Apple Suspends Listening to Recordings of Siri Users

Consumer Reports has no financial relationship with advertisers on this site.

Update: On August 28, Apple announced it will stop retaining recordings of users' interactions with Siri by default, but will continue to store computer-generated transcripts of those interactions. Consumers will be able to opt in to allow the company to keep audio samples to help improve Siri's capabilities, but those recordings will be analyzed by employees rather than contractors, as in the past. The company says it will delete recordings that were triggered inadvertently. Apple said that change will take effect in the fall. This article was originally published on August 2, 2019.

Apple, responding to privacy concerns, has temporarily stopped having humans monitor recordings of consumer interactions with Siri in order to improve the digital assistant's performance.

The company also plans to introduce new tools to give users more information and control over how Apple handles the data.

The change follows an investigation from the Guardian last week about the intimate details of consumers' lives that are often exposed to the Apple contractors who review Siri recordings, including "confidential medical information, drug deals, and recordings of couples having sex."

"We are committed to delivering a great Siri experience while protecting user privacy," an Apple spokesperson told Consumer Reports by email. "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."

Apple declined to comment about whether the company will make any changes to how long it keeps recordings of users' interactions with Siri.

When you use a digital assistant, the interaction may feel like a private, one-to-one affair. But by default, recordings from Alexa, Google Assistant, and Siri devices are sent to corporate servers and kept indefinitely.

Recently, details have emerged about how those recordings often make their way into the hands of human beings, who review, transcribe, and annotate the contents to improve voice recognition systems and other technology.

The practices have drawn criticism from privacy advocates, particularly because the devices can be triggered accidentally when users don't intend to activate them.

Users of Amazon Alexa and Google Assistant already can review and delete these recordings, as well as adjust privacy settings to prevent the audio from being sent to a human reviewer. Until now, Siri's lack of privacy controls made Apple an outlier among the major companies that operate digital assistants.

"We are glad to see Apple provide controls where they haven't before, but the company needs to pair their goal of protecting privacy with reasonable transparency about its practices," says Katie McInnis, policy counsel at Consumer Reports. "Apple has tried to compete as the more privacy-focused company in marketplace, but this example serves to show how hard it is for the average consumer to really know which product or company will protect their digital privacy best."

Apple's decision follows a similar move from Google after a leak of audio recordings from Assistant users in the Netherlands was reported in July. Google announced a global pause of its "language reviews" to investigate the issue, which is still in effect, a Google spokesperson told Consumer Reports by email.

In a statement to Consumer Reports, an Amazon spokesperson said: “For Alexa, we already offer customers the ability to opt-out of having their voice recordings used to help develop new Alexa features. The voice recordings from customers who use this opt-out are also excluded from our supervised learning workflows that involve manual review of an extremely small sample of Alexa requests. We’ll also be updating information we provide to customers to make our practices more clear.”

How to Protect Yourself

Apple users don't need to make any changes for now. The company says it has temporarily suspended the practice of having human reviewers go over Siri voice recordings, and Apple has promised to introduce new controls in a future update. However, keep in mind that there's still no way for Siri users to erase recordings of their voices without deleting all of the data associated with an Apple account. That may be coming down the line.

No matter what kind of digital assistant you use, an easy solution is to prevent devices from listening in the first place by muting them—though that will make it less convenient to use smart speakers and other products. For details about how to mute Alexa, Google Assistant, and Siri devices, check Consumer Reports' guide to smart speaker privacy.

If you're an Amazon Alexa or Google Assistant user, you can follow the steps below for an additional privacy boost.

To keep human employees from listening on Alexa devices: Open the Alexa app on your smartphone and tap the menu button on the top left of the screen > Settings > Alexa Account > Alexa Privacy > Manage How Your Data Improves Alexa.

Turn off the button next to Help Develop New Features. Then turn off the button next to your name under Use Messages to Improve Transcriptions.

To review or delete your recordings from Alexa devices: Open the Alexa app on your smartphone and tap the menu button on the top left of the screen > Settings > Alexa Account > Alexa Privacy > Review Voice History. That will show you the recordings, which can be searched for a keyword or sorted by date.

There’s a tab to Delete All Recordings for Today.

To delete all of the recordings on the device, tap Date Range > All History. Erasing all the recordings is tantamount to resetting the unit, so there might be some reduction in its ability to recognize your voice. Note also that this setting affects recordings already made; it won’t keep the speaker from recording you in the future.

You can also make these adjustments using a web browser through the Alexa Privacy Hub.

To keep human employees from listening on Google Assistant devices: From any Google website, click the icon in the top right (you'll need to sign in first) > Google Account > Manage your data & personalization > Voice & Audio Activity > Switch the toggle off > Pause. (These instructions are for a computer, but the steps are similar on a mobile device.)

To review or delete your recordings from Google Assistant devices: From any Google website, click the icon in the top right > Google Account > Manage your data & personalization > Voice & Audio Activity > Manage Activity. (Again, these instructions are for a web browser on a computer.)

You can search by keyword or by date, and delete recordings individually or in groups. The most privacy-friendly option is to delete all your activity with a single command. From the column on the left, select “Delete Activity By.” Select “All time” from the first menu, and then choose “Voice & Audio” from the products menu.

Erasing all the recordings functionally resets your assistant, which might reduce its ability to recognize your voice. Note also that this setting affects only recordings already made and won’t keep the speaker from recording you in the future.




Consumer Reports is an independent, nonprofit organization that works side by side with consumers to create a fairer, safer, and healthier world. CR does not endorse products or services, and does not accept advertising. Copyright © 2019, Consumer Reports, Inc.