Experts track rising hate speech on Twitter since Musk's takeover

The Twitter logo and Elon Musk’s profile. (Photo illustration: Yahoo News and Dado Ruvic/Reuters; photo: Reuters)

Experts have reported a rise in hate speech on Twitter since billionaire entrepreneur Elon Musk acquired the social media giant in October.

“I think there clearly is a shift in the sort of content,” social media consultant Matt Navarra told Yahoo News. “We’re seeing people who previously felt that they might get banned or suspended when Twitter was not owned by Elon Musk, and now coming out and testing the boundaries of whatever policies or decisions that Elon Musk is putting in place for moderation.”

In the 12 hours after Musk became CEO, researchers tracked 4,778 instances of hate speech on Twitter. The average of 398 incidents per hour was more than four times the highest pre-Musk average of 84 per hour. They also found that the use of the N-word increased by over 500%.

Musk in Norway in August. (Carina Johanson/NTB/AFP via Getty Images)

Twitter’s own policies, however, state that the platform does not tolerate hate speech and imposes consequences on users who engage in it.

“You may not promote violence against ... other people on the basis of race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease. We also do not allow accounts whose primary purpose is inciting harm towards others on the basis of these categories,” Twitter’s help center states.

But even Musk himself has contributed to the hate speech and misinformation on the platform. Vice News recently reported that he “shared a picture of a white supremacist who said he’d like Trump to be more like Hitler; failed to prevent users from posting videos of the Christchurch massacre; tweeted a popular alt-right meme; used a known antisemitic trope; and, inadvertently or not, shared a dogwhistle that white supremacists interpreted as praise for Hitler.”

In addition, on Oct. 30 Musk tweeted a link to an article containing wild inaccuracies about the attack on the husband of House Speaker Nancy Pelosi; he has since deleted it. Many of the CEO’s tweets sharing hate speech or misinformation have been deleted, but the content lives on in screenshots.

“Once you put that out there, the world has seen it and you’ve sent the message. And there’s really no unbreaking of the egg, the egg is broken,” Bond Benton, an associate professor of communication at Montclair State University in New Jersey, told Yahoo News.

House Speaker Nancy Pelosi and her husband, Paul, at the Kennedy Center Honors in Washington, D.C., on Sunday. (Saul Loeb/AFP via Getty Images)

As Twitter sees a rise in hate speech, experts say Musk’s tweets are not setting the right example. “His account is pushing the limits,” Navarra said.

Experts fear that Musk’s tweets are inciting further hate speech and are influencing the nearly 400 million users on the platform.

“It’s probably not a good thing to be the owner and CEO of a social media platform — and then you’re leading by example, setting the tone of the platform — posting some of the stuff that Elon Musk has posted in recent weeks,” Navarra said.

Since Musk’s acquisition of the platform his influence has grown, and experts predict he will become the No. 1 influencer on Twitter by January. He is currently the second-most-followed Twitter user, with 120 million followers.

Social Blade, a site that tracks social media analytics, said in late November that Musk had gained 268,303 followers a day on average since taking over, and that he had been posting 84% more often since acquiring the company. In addition to his influence, he has also opened the door for those who had been previously banned from Twitter for using their accounts to spread bigotry or misinformation, or to encourage violence.

“It’s dangerous because Elon Musk made the decision to reinstate people who specifically violated their stated plans using hate speech around trans people specifically, those are the first people that he reinstated — people like Jordan Peterson [and] Babylon Bee,” Michael Edison Hayden, spokesperson at the Southern Poverty Law Center, a nonprofit that monitors hate groups, told Yahoo News.

Last month, Musk announced he would provide “general amnesty” to Twitter accounts that were previously suspended for violent threats and misinformation.

In November, former President Donald Trump’s Twitter account was reactivated. He had been banned in January 2021 after his posts incited a violent insurrection at the U.S. Capitol. And on Nov. 20, Musk welcomed hip-hop artist Kanye West, now known as Ye, back onto the platform, after he was restricted for posting antisemitic comments.

Shortly thereafter, West repeated hate speech on the platform and was suspended for posting an image of a swastika inside a Star of David.

Benton says those who are reinstated on the platform will probably push the envelope. “The fact that they’ve been reinstated almost sends a message in some ways that the content that they put out before was actually not that bad,” he said.

Yahoo News reached out to Twitter for comment but received no response. Musk, however, recently tweeted that hate speech is declining on the platform and that @TwitterSafety will publish weekly updates on the issue.

Recent tragedies may also have fueled a rise in hate speech on the platform. “We see a direct correlation between what is said and how people say things and major acts of violence,” Hayden said.

Researchers at Montclair State University found that following the Colorado Springs, Colo., shooting last month, there was widespread use on Twitter of the term “groomer,” which has been used increasingly in anti-LGBTQ rhetoric.

A memorial for the victims of the shooting at Club Q in Colorado Springs, Colo., on Nov. 22. (Hyoung Chang/Denver Post via Getty Images)

“It was previously noted as a slur on Twitter prior to Musk’s arrival, and restrictions on its use were in place,” Benton said. “After his arrival, a lot of these kinds of restrictions appear to have been loosened. And in the period after the Colorado Springs shooting, what we saw is this slur, this really hateful slur against the LGBTQ+ community, it absolutely exploded on Twitter.”

Overall, use of the term spiked 885% in the wake of the shooting, in which five people were killed and 25 injured at the LGBTQ nightclub.

“What we have is kind of a massive explosion in hate around a dramatic event. When we have things like a contested election, or maybe another health scare like COVID, it’s going to be absolutely free-range on Twitter for more of this content to come out, because it’s exactly what we’re seeing right now,” Benton said. “Hateful people can write fliers in their basement describing their hate however they want to and distribute them however they want to. They’re not necessarily entitled to a billboard or a flashing neon sign with their hate. And yet it seems to be that that’s one of the things that Twitter is providing them.”