When People Think Siri Sounds Black, Biases Often Surface

Research shows that gender and racial bias can crop up even when people interact with completely artificial robot voices

By Kaveh Waddell

For most of the past decade, digital assistants like Siri, Alexa, and Google Assistant had slightly robotic, white-sounding female voices by default. But recently, they’ve started sounding different.

Last year, Amazon’s Alexa got a male-sounding alternative voice for the first time. (That’s in addition to three Alexa voices that sound like specific celebrities.) This March, Apple gave Siri a new voice option that sounds neither traditionally masculine nor feminine. And in an earlier software update, Apple released two voices that users were more likely to describe as sounding Black than Siri’s original voices, according to two surveys by a linguist at the University of Pennsylvania.

Users eager for broader representation in their everyday digital tools told CR they welcomed last year’s Siri additions. Several Black Siri users said they found it validating to hear a voice that sounded like a young Black person in the role of an all-knowing virtual assistant.

But as digital assistants come to represent a more diverse set of gender and racial identities, research is showing that people often bring their biases to these interactions, treating the assistants as though they were human beings even though the voices are computer-generated.

Nicole Holliday, PhD, an assistant professor of linguistics at the University of Pennsylvania, conducted a survey about Siri’s new voices last spring. Later she asked 485 U.S. English speakers to rate four Siri voices on several character traits: friendliness, funniness, professionalism, and competence. (The fifth, less gendered American English voice wasn’t included because it hadn’t been released yet.)

Holliday found that people’s reactions to Siri’s voices mirrored some gender and racial stereotypes. The male-sounding voice that was disproportionately characterized as sounding like a Black person was judged the funniest of the four, but participants also rated that voice less competent and less professional than the other voices.

“The voice gets the same negative stereotypes that we assign to Black men,” Holliday says.

That held true regardless of the listener’s own gender or race. Black male participants, for example, were just as likely as white female participants to rate this voice lower in professionalism and competence.

“We have centuries-old stereotypes that were basically constructed to project the Black man to be that way,” says Sherri Williams, PhD, an assistant professor of communications at American University who studies race and media. She wasn’t involved in Holliday’s research but adds, “These ideas about Black men are deeply ingrained in the American imagination.”

The female-sounding Siri that sounds to many people like a Black English speaker, on the other hand, wasn’t judged differently from the voices most people perceived as white. Holliday says that might be in part because women generally speak in a more “standard” way than men, which can minimize perceived differences between racial groups.

“Whenever you hear a voice and you don’t see a body attached to it, you imagine one,” says Holliday, who researches people’s language-based perceptions of others. Listeners are quick to assign a persona to Siri, even though the voices are all computer-generated. (They are, however, based on actual people’s voices, Apple has confirmed. The voices added last year, for example, were based on recordings from Black voice actors.)

Apple doesn’t label the voices with gender or racial identities, and Amazon and Google say their assistants aren’t meant to reflect a particular race or gender.

Assuming a racial or gender identity based solely on a voice is totally normal, Holliday says. Often it’s useful for navigating interactions with strangers. But it becomes harmful when someone uses the imagined identity to discriminate against a speaker.

And people’s reactions to synthetic voices could have real-world consequences.

In 2020 a trio of researchers at the Colorado School of Mines studied people’s reactions to a robot that refused to do something it was asked to do. A robot’s ability to reject human commands matters in high-stakes work like medicine or construction, but the researchers found that communicating that rejection to the robot’s operator is a delicate task.

Participants reacted differently when the robot rejecting the command used a male-sounding voice than when it used a female-sounding voice. When the robot had a male-sounding voice, male participants liked it more after the rejection than before. But when the robot had a female-sounding voice, male participants liked it less after the refusal. As a possible explanation, the authors cited research showing that women are generally expected to be “nicer” than men.

Even though people appear to apply their human prejudices to interactions with robots, experts say there’s value in digital assistants that embody different identities.

“It’s important to have a variety of representation that’s visual and auditory,” Williams says. “More representation can help disrupt stereotypical ideas that people have about others.”

Besides, it’s not practical to try to strip a synthetic voice of any gender or race markers, according to Holliday. People will automatically try to assign an identity anyway. “Our ability to classify voices like this is part of our language faculty,” she says.

Plus, Holliday says, because of the dominance of white voices in American society, a voice that’s designed not to have any racial traits will probably sound white to most listeners, the way Siri’s original voices did to a majority of participants in her studies.

Before Apple added the latest Siri voices last year, the virtual assistant defaulted to the female-sounding voice that most people identified as white in Holliday’s research. Amazon’s Alexa still defaults to a white-sounding female voice. Google Assistant did the same until 2019; since then, new users who buy a Google smart device have been randomly assigned either a male-sounding or a female-sounding voice.

Google, Amazon, and Apple wouldn’t say whether they plan to add new voices that reflect a specific ethnic or racial group.

Critics have long faulted the tech giants for making their standard assistants sound like women. In 2019 a United Nations body published a report calling it a harmful example of gender bias in technology.

Since last April, Siri no longer defaults to a single voice. Instead, users are asked to choose their preferred voice when they’re setting up a new device or updating an old one.

Even if you’ve already picked a voice, you can always try out a new one. If you have an iPhone, iPad, Apple Watch, or Mac, you can choose the voice you want to assign to Siri in the Siri section of Settings or System Preferences. To change the Siri voice on a HomePod or HomePod mini, use the Home app on your Apple device.

In addition to the five American English voices, you can set Siri to speak in Australian English, British English, Indian English, or several other varieties, or in many other languages.

Google Assistant has eight American English voices, which you can test out to find your favorite. And you can swap between Alexa’s original female-sounding voice and a new male-sounding voice by telling Alexa, “change your voice.”




Consumer Reports is an independent, nonprofit organization that works side by side with consumers to create a fairer, safer, and healthier world. CR does not endorse products or services, and does not accept advertising. Copyright © 2022, Consumer Reports, Inc.