Twitter blocks hashtags used to promote child sex abuse material after NBC News review

Twitter on Saturday blocked searches for a series of hashtags and keywords used to promote the sale of child sex abuse material (CSAM), following an NBC News investigation published the day before.

NBC News found that a series of hashtags on the platform related to the file-sharing service Mega served as rallying points for users seeking to trade or sell CSAM. NBC News observed the hashtags over a period of several weeks, and counted dozens of users who collectively published hundreds of tweets daily.

The accounts used thinly veiled keywords and terms related to CSAM to promote the content they said was stored on Mega, which they said was available for purchase or trade.

Twitter prohibits any promotion of CSAM on its platform. Since taking over the company, CEO Elon Musk has vocally criticized its former leadership, claiming they did not do enough to address child sexual exploitation material on the platform. Musk in November said cleaning up the platform and addressing child exploitation on it was his “priority #1.”

NBC News’ examination of the hashtags found that some accounts had been using them for months, and that dozens of users had tagged Musk in tweets using the hashtags, attempting to alert him to the issue. The hashtags, however, appeared to remain largely unmoderated until Saturday.

Following NBC News’ report Friday, Ella Irwin, Twitter’s vice president of product trust and safety, whose responsibilities include overseeing child safety on the platform, said in an email, “We will do a specific additional review this weekend and see what we find. As you probably know the links you shared relate to a file sharing service broadly used for a wide variety of purposes and so that makes it much harder to find the specific illegal content being posted using the hashtags in question.”

In a follow-up email Saturday, Irwin said she met over the weekend with her team and decided to ban the hashtags, which she said had been under review by the company.

“We were already reviewing doing this in the coming weeks, given that we have banned other hashtags used commonly for trafficking [CSAM] material already, however we made the decision to accelerate this action for these terms,” she said.

Irwin said Twitter had spent the last six weeks analyzing thousands of hashtags for a project scheduled for completion in the next few weeks. She noted that the company did not want to ban hashtags that had legitimate uses, but in this case it decided to act.

“If bad actors are successfully evading our detection using these specific terms and in spite of our detection mechanisms currently in place, then we would rather bias towards making it much harder to do this on our platform,” Irwin wrote.

In a review of hashtags and tweets last week, NBC News confirmed that searches related to the file-sharing site Mega had been blocked. Other hashtags related to different encrypted platforms and other keywords associated with CSAM were still active.

In an email Friday, Mega’s Executive Chairman Stephen Hall said that the encrypted service, which is based in New Zealand, had a zero-tolerance policy toward CSAM. “If a public link is reported as containing CSAM, we immediately disable the link, permanently close the user’s account, and provide full details to the New Zealand authorities, and any relevant international authority,” Hall wrote.

In an email Tuesday, Hall reacted to the news of the Mega-related terms being blocked on Twitter by writing that it was “a rather blunt reaction to a complex situation.”

Despite Musk’s claims that he is prioritizing the elimination of CSAM on Twitter, layoffs and staff reductions appear to have hobbled the company’s Trust and Safety group, which houses the employees overseeing child safety.

According to Securities and Exchange Commission documents and internal records obtained by NBC News, fewer than half as many employees now work in trust and safety at the company as did at the end of 2021. According to Bloomberg, the trust and safety team underwent further cuts this month.

A former employee who asked to remain anonymous because they had signed a nondisclosure agreement said that many of the employees specifically tasked with child safety issues had departed the company.

Irwin said in an email that Twitter has “roughly 25% more staffing on this issue/problem space now than the company had at its peak last January.” She said “many employees who were on the child safety team last year are no longer part of the company but that primarily happened between January and August of last year due to rapid attrition Twitter was experiencing across the company.”