Meta whistleblower warned executives about social media's dangerous effects on young teens. They ignored him

Arturo Bejar, former Facebook employee and consultant for Instagram, testifies before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law during a hearing to examine social media and the teen mental health crisis, on Tuesday, Nov. 7, 2023, on Capitol Hill in Washington. | Stephanie Scarbrough, Associated Press

Sen. Richard Blumenthal painted a stark picture of Facebook ignoring the threats that its products pose to children and teenagers at a subcommittee hearing on Tuesday.

Two years ago, Blumenthal, D-Conn., chairman of the subcommittee, and Sen. Marsha Blackburn, R-Tenn., held a hearing featuring whistleblower Frances Haugen.

“My name is Frances Haugen. I used to work at Facebook and joined because I think Facebook has the potential to bring out the best in us. But I am here today because I believe that Facebook’s products harm children, stoke division, weaken our democracy, and much more,” Haugen said.

On the same day as the hearing, Arturo Bejar, the then-director of Engineering for Protect and Care at Facebook, emailed Meta CEO Mark Zuckerberg, Instagram CEO Adam Mosseri and other executives with documents that validated Haugen’s testimony, Blumenthal said in his opening remarks.

This email, first reported by The Wall Street Journal, outlined Bejar’s belief that there was “a critical gap” in the way the company approached harm, and that to close it, the company would need to admit its current approach wasn’t working.

Earlier this year, Haugen published a book recounting her time as a data engineer at Facebook, which led her to give the Securities and Exchange Commission internal company documents alleging that the platform had negative effects, like depression or body dysmorphia, on young users. (Read an excerpt of the book here.)

Meta whistleblower testimony reveals disturbing data

Bejar never received a reply to his email. Two years later, he said, the situation remains unchanged, which led him to testify before Congress on Tuesday.

“I appear before you today as a dad with firsthand experience of a child who received unwanted sexual advances on Instagram,” he said in his testimony.

“She and her friends began having awful experiences, including repeated unwanted sexual advances and harassment. She reported these incidents to the company, and it did nothing,” Bejar said.

Bejar’s documents indicated that a quarter of young teens, 13 to 15 years of age, reported receiving sexual advances on Instagram, while a third of them witnessed discrimination on the basis of gender, religion, race or sexual orientation, Blumenthal said.

When using the social media platforms, teens in this age group also reported feeling “worse about themselves, about their bodies and their social relationships — the type of experience that leads to serious depression and eating disorders,” Blumenthal said.

Meanwhile, around 7% of Facebook users encountered content promoting suicide and self-harm, with young teens seeing this content more often than adults, he said.

What’s most striking is that Facebook only responded to 2% of complaints made by users about harmful content, Blumenthal added.

Bejar told The Associated Press that mitigating these effects isn’t complicated: “Just give the teen a chance to say ‘this content is not for me’ and then use that information to train all of the other systems and get feedback that makes it better,” he said.

Meta has not made any major reforms

In response to the hearing, Meta spokesperson Andy Stone said in a statement, “Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” according to CNBC News.

“The issues raised here regarding user perception surveys highlight one part of this effort, and surveys like these have led us to create features like anonymous notifications of potentially hurtful content and comment warnings,” Stone said.

“Working with parents and experts, we have also introduced over 30 tools to support teens and their families in having safe, positive experiences online. All of this work continues.”

Earlier in June, Meta launched a task force to investigate allegations that Instagram hosted the distribution and sale of self-generated child sexual abuse material, as The Guardian reported. But no major reforms have been announced so far.

“We can no longer rely on social media’s mantra ‘trust us,’” said Blumenthal. He advocated for the passage of the Kids Online Safety Act.

As the Deseret News previously reported, this bill shares many similarities with the Utah Social Media Regulation Act, which was the first law of its kind to require age regulation for all social media users. The Utah law also gave parents more control over their children’s social media accounts.

The Kids Online Safety Act has bipartisan support, with 46 co-sponsors, which is nearly half the U.S. Senate.

“See, there are laws in the physical world that protect children from all of this. But online, it’s been the Wild West,” said Blackburn, a Republican from Tennessee, at the hearing.

“We have fought this army of lobbyists for years,” she said, adding, “Big Tech has proven they are completely incapable of governing themselves, of setting up rules, of having guidelines, of designing for safety.”