The inside story of the ‘Facebook Files’ from whistleblower Frances Haugen


The heavy wooden doors parted in front of me and I walked into a United States Senate hearing room as if on autopilot. I was running on maybe 4½ hours of sleep, and with each step I felt like I was walking through mud, forcing myself forward. If you had sat Jeff Horwitz, Wall Street Journal technology reporter, and me down on my last day at Facebook and proposed so much as the idea, the possibility, that 4½ months later I would be walking into a Senate hearing to testify about what was really going on at the company, that I would be “coming out” publicly as the whistleblower, we would have been horrified. “Absolutely not,” I would have said. But now I was sitting at a table, the senators arrayed in front of me, without him.

Jeff had become my most consistent friend for the previous nine months. Others had provided support off and on, as I had moved between different COVID‑19 housing permutations, but Jeff had been a constant. He was my rock. He had believed me when I felt alone at Facebook. He had given me the support to follow my conscience and been the best collaborator I could have had throughout it all.

The committee was called to session, and the hearing went by in a blur. The chairs read their opening statements, and then I read mine. After the obligatory preliminaries, I launched into the heart of the matter: “My name is Frances Haugen. I used to work at Facebook and joined because I think Facebook has the potential to bring out the best in us. But I am here today because I believe that Facebook’s products harm children, stoke division, weaken our democracy, and much more.”


Just six weeks earlier, at the start of August, the Commerce Committee had drawn attention in a public letter to a few comments Mark Zuckerberg had made when he had been hauled in front of Congress in March of 2021 to explain Facebook’s response to the events of January 6. He had stated he believed Facebook was conducting research on the effect of its products on children’s mental health and well‑being. The committee’s public request was clear: “Has Facebook’s research ever found that its platforms and products can have a negative effect on children’s and teens’ mental health or well‑being, such as increased suicidal thoughts, heightened anxiety, unhealthy usage patterns, negative self‑image, or other indications of lower well‑being?” The senators demanded to see any research Facebook had concerning children’s mental health.

Before the release of the Facebook Files, my lawyers reached out to the Senate Commerce Committee’s Consumer Protection Subcommittee to brief them on the upcoming news. The subcommittee had been investigating social media for months, and its staff jumped at the idea of my testifying after I came out.

Former Facebook data scientist Frances Haugen speaks during a hearing of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, on Capitol Hill, Tuesday, Oct. 5, 2021, in Washington. A quote from Haugen’s testimony made a list of noteworthy quotes assembled by Fred Shapiro, an associate director at the Yale Law library. He said he picks quotes that are important or revealing of the spirit of the times, not because they are necessarily eloquent or admirable. | Jabin Botsford, The Washington Post via Associated Press

Two weeks later, when Facebook’s response to the Senate was due for release, Jeff got his hands on a copy of the company’s statement and called me to get my thoughts. As we read through it together, my eyes widened at Facebook’s hubris. Facebook had a pile of research interviews with children who said they compulsively used Instagram even though it made them unhappy, interviews their own researchers summarized as “addicts’ narratives”: It makes me unhappy; I can’t stop using it; if I leave, I’ll be ostracized. And yet Facebook’s response to Congress was little more than a shrug: “We are not aware of a consensus among studies or experts about how much screen time is ‘too much,’” they responded, gracefully sidestepping the actual question asked, before producing 800 words detailing various media literacy programs they had funded and user experience controls they had built.

Facebook was technically correct that there was no consensus in academia or the public about what amount of social media was “too much.”

That was in no small part because Facebook kept all of its data hidden and wouldn’t let academics have access to it for studies. But Congress hadn’t asked how much social media was a “bad” amount of social media; they had asked whether Facebook’s research ever indicated that kids might be harmed by Facebook’s products ... and now I was sitting in front of congressional staffers who were thumbing through printouts of that very research. They were not pleased.


Instagram is probably a positive force in many kids’ lives, but like all harms on social media, the negative impacts of the platform are not evenly distributed. For any given type of harm caused by the platform, whether we’re talking about kids or adults, you should expect around 10% of users to bear an outsized share of the burden, because that small fraction comprises by far the heaviest users. When it comes to consuming content on Facebook, the top 1% of adult users might consume 2,000 to 3,000 posts per day, while the average user consumes only 20 or 30. In the case of “problematic use” (Facebook’s phrase for Facebook addiction), Facebook researchers found that kids’ problematic use rates rose from 5% when they were 14 years old to 8% when they were 16.

That might not sound too bad, but in 2018, when the study was done, Facebook had long since ceased being the most popular social media platform for teenagers. Instagram and Snapchat held that distinction.

If 8% of 16-year-olds could admit that they had “problematic use” on Facebook, imagine how many more might have admitted problems with Instagram, let alone the ones who would have denied it despite struggling. And that was just compulsive use. Other harm types were concentrated as well. Thirty-two percent of teenage girls said that when they felt bad about their bodies, Instagram made them feel worse. Other internal documents talked about self-harm communities that encouraged kids and adults to hurt themselves.

How many of the platform’s current 14-year-olds had already been on the platform for two, three, four or more years, despite Facebook’s minimum age of 13? If Facebook had to report that number, there would suddenly be a yardstick for comparing how much each platform was actually doing to protect children. It would take Facebook less than a day of work, done only once, to build a pipeline that would provide that report on an ongoing basis. They don’t share it with the public because they don’t want the public to know the true extent of underage children on the platform.

The Senate hearing room inside the Russell Building was a long rectangle, and we entered through a side door. Running the length of the chamber was a long U‑shaped dais where the senators sat. The senators seemed to loom above me, but I’ve since looked at pictures of that day, and the floor is level. I guess it says something about the intimidation I felt when they eventually took their seats. Only you can make yourself feel small. I walked over to the witness table and sat down. Out of nowhere press photographers appeared and swarmed around me for about two minutes. Once the hearing began, they wouldn’t be allowed to get between the senators and me, and they were making the most of their remaining minutes of access.

I spoke for just a few minutes. Each committee member took their five minutes to ask questions, sometimes (thankfully) veering off into a monologue for minutes at a time, which gave me a chance to regroup. The questions were largely good and relevant. If the staffers had seemed upset that Facebook had lied by omission about the impacts of their products on children, the senators gave the impression that they were even angrier.


Woven through my answers was my attempt to lay out the argument as cleanly as possible: Facebook was full of kind, smart, conscientious people, who were limited in how they could act by a system of governance that showed no signs of changing on its own. Facebook had built a corporate culture that valued “objective” measures like computable metrics over human judgment.

This culture empowered Facebook’s youngest employees, who fueled the social media innovation the company required. But the philosophy consistently missed preventable problems until it was too late. It’s hard to measure the impact of stopping a fire before it rages, and corporate cultures too wedded to navigating by “objective” numbers instead of human judgment risk creating “arsonist firefighters.” Facebook’s problems had grown so severe because the company knew it held all the cards. Its closed software ran in data centers no one could inspect; as a result, it could deny, deflect, or minimize any concern brought to its leaders about its products.


Facebook will not change as long as the incentives do not change. As long as the only metrics it has to report externally are about the economics of the business, it will keep paying for those profits with our safety. These problems are not isolated to Facebook. I had worked at other algorithmic companies and understood why sensible people at Facebook had made the choices they’d made. The same challenges exist anywhere the public lacks the transparency needed to ensure that conflicts of interest between safety and profits are resolved in the public’s best interest. Hope and change are possible.

We can demand transparency, and the social pressure that comes with it, to incentivize changes in the design of algorithms and of the social platforms themselves. Facebook claims we must choose between the status quo and extreme censorship. This is not true. Facebook knows how to solve its problems without picking “good” or “bad” ideas; it just feels it can’t be the first mover to use those interventions, or its stock price would plummet.


A year later, my message was the same as the one I had delivered in my opening address to the Senate: “These problems are solvable. A safer, more enjoyable social media experience is possible.”

Corporations are facing a fundamental decision over the next few decades. We are entering an era in which employees understand they’re disposable, while also understanding that the systems they work on, the systems that fuel our economy, are more opaque to those outside the corporate walls than they were in the past. The next generation of employees understands, in a way preceding generations did not, that if they don’t act to share information about the black-box systems running on chips in data centers, the public won’t get the information it needs to provide oversight of tech companies, with potentially deadly consequences. The future is likely to bring many more Frances Haugens.

Companies will have to decide whether they want to be transparent with the public, and therefore not have to fear their employees, or whether they want to police their employees and accept that they’ll lose the best talent. People don’t like to have to keep secrets. People don’t like to lie. Neither activity is free. And the benefits of lying are in fact long‑term liabilities.

Facebook didn’t want to see its lies as liabilities. When the truth broke, the company lost users and advertisers, and had to dramatically increase its safety spending. Twice in five years, first in 2018 and then again in early 2022, Facebook/Meta set the record for the largest one-day drop in value in stock market history, both times when its narrative met reality head on. We should expect to see this over and over again with opaque companies in the future. Lies are liabilities. Truth is the foundation of long-term success.

This article is a modified excerpt from Frances Haugen’s “The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook,” published by the Hachette Book Group. Haugen is an American data engineer and scientist who worked at Google, Yelp and Pinterest before joining Facebook in 2019 and working in its civic integrity department. In the spring of 2021, she gave tens of thousands of internal Facebook documents to the Securities and Exchange Commission and The Wall Street Journal.