Here’s Every Major Service That Uses Humans to Eavesdrop on Your Voice Commands

Photo Illustration by Elizabeth Brockway/The Daily Beast/Getty

All of your voice assistants are listening to you—and so are the humans behind them.

A string of recent reports revealed that the world’s largest tech companies employ human moderators to listen to recordings of users issuing commands to their voice assistants.

That fact came as a surprise to many users, who reasonably assumed that big tech's sophisticated AI systems handled all of the dirty work. Many people believed the seconds-long snippets of their voices disappeared into the digital ether after being processed, either deleted or archived on a faraway, untouched server.

But that’s not always the case, and users were not pleased to find out that other humans were listening to them, sometimes when they hadn’t even invoked the robotic assistants. In response to the backlash, several tech giants paused human review of the audio snippets indefinitely, while others issued statements minimizing how much voice content human contractors actually reviewed.

Here’s a summary of what happened with each tech giant’s voice assistant and what you need to know going forward.

APPLE

Apple’s voice assistant Siri comes pre-installed on iPhones, iPads, Mac computers, HomePods, and Apple Watches. People carry these devices everywhere they go, so when a whistleblower told The Guardian in July that Apple contractors reviewed recordings of commands users had given Siri, it stood to reason that the activities detailed in the snippets were intimate: drug deals, hookups, hospital visits—everything. Often the users had triggered the assistant by accident. Apple said it sent only a “small percentage” of Siri recordings to the contractors and paused human review in early August. Apple has also said that a forthcoming software update will let users choose whether their voice recordings may be “graded” by human reviewers. In the meantime, disabling Siri is a pretty easy precaution to take if you'd like total control over your voice commands.

AMAZON

Amazon’s Alexa, the voice assistant built into the Echo smart speaker, competes with Siri for the title of best-known voice assistant. Families often use it to orchestrate their entire households via its many connections to smart home devices. So when it came to light that teams of Amazon workers in Costa Rica, India, Romania, and Boston listened to Alexa recordings, consumers were understandably skittish. Contractor access at Amazon was unusually extensive: the workers told Bloomberg that they often saw users’ locations alongside the voice recordings, and some reviewers said they would share recordings they found funny in chat rooms. The good news: Alexa users can disable the voice assistant’s human auditing in Alexa’s privacy settings.

GOOGLE

Google provides the Google Assistant as part of its flagship Google app for iOS, as a software complement to its smart speaker, Google Home, and baked into the Android experience. As with the other voice assistants listed here, human reviewers grade the AI’s responses to user commands, a practice Alphabet, Google’s parent company, said was “necessary to creating products like the Google Assistant.” The company defended itself by asserting that it sent just 0.2 percent of Google Home commands to human reviewers, but said it would pause the practice indefinitely. As an extra precaution, Google Assistant users can disable the retention of voice and audio activity in their Activity Controls, though Google cautions this “may limit or disable more personalized experiences across Google services.”

FACEBOOK

Facebook has long been plagued by so-far-unfounded rumors that it eavesdrops on users without their permission via their phones’ microphones, so much so that Congress asked CEO Mark Zuckerberg about it at a hearing in April. People often complain on social media that ads on the social network are eerily accurate, almost invasively so.

The company maintains that it doesn’t listen to users for ad targeting, but its human reviewers did hear voice recordings people made with Facebook Messenger’s voice-to-text transcription feature, which users may have opted into without fully understanding the implications. Third-party contract workers told Bloomberg they had escalated ethical concerns about their work to their superiors because management never told them why the recordings had been made at all. Facebook said it had paused human review of voice recordings. As a precaution, check individual Messenger threads to make sure the “Automatic Voice to Text” option is toggled off.

MICROSOFT

Microsoft’s voice assistant Cortana is nowhere near as ubiquitous as Alexa, but the company had other ways to listen in on users: Skype and the Xbox. All three services relied on human contractors to review users’ recordings, according to Motherboard: Skype through its translation feature, and the Xbox through both Cortana and its “Xbox” trigger word. Many of the Xbox recordings, the contractors said, were of children, raising privacy concerns for those under 13, who are more strictly protected by law. Rather than pausing human involvement, though, Microsoft updated its privacy policy to say that its review process may include people listening to users’ recordings.

And what of Samsung’s Bixby, which comes preloaded on the tech giant’s line of Galaxy phones? The South Korean company has refused to answer The Daily Beast’s requests for comment on human listeners for nearly three weeks.

These aren’t the first stories revealing tech’s proverbial man behind the curtain. Many content-moderation algorithms on social networks rely on humans whose decisions signal to the artificial intelligence what to flag as inappropriate. Training any algorithm requires huge datasets, and assembling those datasets often takes armies of people to label the data. Large tech companies often lean on third-party contract workers for these efforts—a trend users have been hearing more and more about in recent years.

As is often the case with privacy, the best way to insulate your private conversations from prying, non-AI ears is to make preventive choices in the settings of your most-used apps wherever possible while tech companies bring their practices up to snuff.
