Facebook staff complained for years about their lobbyists’ power

Facebook says it does not take the political winds of Washington into account when deciding what posts to take down or products to launch.

But a trove of internal documents shows that Facebook’s own employees are concerned that the company does just that — and that its Washington, D.C.-based policy office is deeply involved in these calls at a level not previously reported.

The lobbying and government relations shop, overseen by former Republican operative Joel Kaplan, regularly weighs in on speech-related issues, such as how to deal with prominent right-wing figures, misinformation, ads from former President Donald Trump and the aftermath of the George Floyd protests in June 2020, according to internal reports, posts from Facebook’s staff and interviews with former employees. The dynamic is so prevalent that employees argued internally that Facebook regularly ignored its own written policies to keep political figures happy, even overriding concerns about public safety.

“Facebook routinely makes exceptions for powerful actors when enforcing content policy,” a Facebook data scientist wrote in a December 2020 presentation titled “Political Influences on Content Policy.” It added: “The standard protocol for enforcement and policy involves consulting Public Policy on any significant changes, and their input regularly protects powerful constituencies.” The public policy team includes the company’s lobbyists.

The new disclosures come after years of internal and external grumbling at Facebook about the role played by Kaplan, who has angered Democrats in Washington who say he has amassed outsized power at the company and used his position to cater to the GOP. They also follow recent revelations in The Wall Street Journal that Facebook has a private internal system, known as XCheck, that exempted high-profile users such as Trump from the company’s normal rules.

The latest disclosures are likely to add to those complaints — even as Facebook continues to insist, as a spokesperson did Friday, that Kaplan’s team is “just one of many groups consulted” on content decisions.

"It is a fatal flaw of Facebook as a company that their team in charge of lobbying governments clearly is empowered to intervene on product and content decisions in ways that make it impossible to do good work," said Jesse Lehrich, co-founder of advocacy group Accountable Tech and former spokesperson for Hillary Clinton.

Political concerns weigh on content decisions

This story is based on seven internal documents as well as four interviews with former Facebook employees and people familiar with Facebook’s operations. The documents are included in disclosures made to the Securities and Exchange Commission and provided in redacted form to Congress by the legal counsel for Facebook whistleblower Frances Haugen. The redacted versions were reviewed by a consortium of news organizations including POLITICO.

According to the documents, sensitive content moderation decisions and significant changes to Facebook’s news feed and its news and recommendations features undergo a review process in which public policy team members come to the table alongside other teams and weigh in with potential political concerns.

One Facebook manager defended the public policy teams’ role within the company’s decision-making process in an internal post on Aug. 18, 2020, following concerns that the team handling policy in India made special exceptions for hate speech by a member of Prime Minister Narendra Modi’s party. (That team reports to Ajit Mohan, Facebook’s vice president and managing director in India.)

"Public policy teams are important to [the] escalations process in that they provide input on a range of issues, including translation, socio-political context, and regulatory risks of different enforcement options,” the manager wrote. The manager added that the policy teams’ perspective is one of many that Facebook leaders take into account before making decisions.

Kaplan’s public policy team was one of the key groups overseeing XCheck.

But the team’s influence extends beyond that. Kaplan’s team intervened to protect right-wing figures such as provocateur Charlie Kirk, the conservative publication Breitbart and activists Diamond and Silk from consequences for violating Facebook’s policies against misinformation, according to the 2020 document, excerpts of which previously appeared this year in BuzzFeed News. And CEO Mark Zuckerberg has gotten personally involved in dictating how content moderation will operate at the company, according to the document and several internal posts.

Differences from Twitter and Google

In message board conversations dating back to 2019, Facebook employees took particular issue with one aspect of the company’s internal structure: The teams charged with writing and enforcing Facebook’s content rules answer to Kaplan, the company’s vice president of global public policy.

That’s different from other social media companies such as Twitter, which separates its “trust and safety” team — a group that handles difficult questions around online speech — from its policy team, which interacts directly with governments around the world. A similar firewall exists at Google, where the heads of the company’s content and public policy teams answer to different executives.

“People bring up over and over again this idea that having both those functions tied to the same group is dangerous because they have different interests,” Haugen said in a virtual briefing arranged by her public relations team with POLITICO and other news outlets on Oct. 15.

Facebook spokesperson Corey Chambliss said the company’s content policy and public policy teams “operate independently” and that the content policy team relies on input from teams throughout the company, including “Operations, Engineering, Legal, Human Rights, Civil Rights, Safety, Comms and Public Policy.”

“In these instances Public Policy is just one of many groups consulted,” Chambliss said. “And, while the perspective of Global Public Policy is key to understanding local context, no single team’s opinion has more influence than the other.”

Kaplan did not respond to a request for comment on his role in the decisions. But Facebook spokesperson Joe Osborne said in a statement Sunday that “recycling the same warmed over conspiracy theories about the influence of one person at Facebook doesn’t make them true.”

“The reality is big decisions at Facebook are made with input from people across different teams who have different perspectives and expertise in different areas,” Osborne said. “To suggest otherwise is inaccurate.”

‘There should be a firewall’

Even so, the dynamic among the teams has drawn scrutiny in previous years. At one point, Kaplan’s team intervened to pare back proposals aimed at improving civic discourse on the platform for fear they would anger conservatives.

“There should be a firewall between the two teams,” said Evelyn Douek, a Harvard scholar who researches private content moderation. “As long as they are representing that [political] considerations don’t play into how they do content moderation, they should make that real and have an internal structure that mirrors their external representations. That is something that other platforms have done.”

The newly obtained documents show that employees had concerns and warnings about a host of content decisions.

The 13-page December 2020 presentation takes issue with the public policy team’s insistence on exempting right-wing publishers from punishment for spreading misinformation, Facebook’s decision to overturn a fact-check on a post that claimed “abortion is never medically necessary” after an uproar from Republican politicians, and Facebook’s “newsworthiness” exception to its misinformation policy, which allowed political figures to speak more freely than other users.

The author — whose name is redacted in the copy POLITICO reviewed — said that “almost all” the examples had been reported in the news media, “but I thought it’s worth documenting in a single note.”

The document also calls out the public policy team’s involvement in broader product changes and launches at Facebook.

“When significant changes are made to our algorithms (ranking, recommendations) they are usually reviewed by staff from public policy,” reads the December 2020 document. “Public policy typically are interested in the impact on politicians and political media, and they commonly veto launches which have significant negative impacts on politically sensitive actors.”

While the document did not mention any specific product launches, Haugen said that during her nearly two years with the company, she saw Facebook roll back a change to the platform that would have reduced misinformation because it disproportionately affected right-wing users.

“There’s this tiny sliver of power users and they're sharing tens or hundreds of times a day,” Haugen said. “If you just remove the content from those people, misinformation drops dramatically, like we're talking 20 or 30 percent dramatically.”

But she said the company concluded that because “there was more of an impact on the right than the left, this was not an acceptable intervention.”

Making the president happy?

At Facebook, the content policy and public policy teams in the U.S. both report to Kaplan, who in turn reports up to Nick Clegg — Facebook’s vice president for global affairs and communications and public-facing political fixer. And the public policy team is heavily involved in content and product-related decisions, two former Facebook employees who worked with the public policy team said in interviews with POLITICO.

“When you have the head of content policy reporting to a lobbyist who has to make the president happy, that’s an unhealthy dynamic,” said one of the former employees, who requested anonymity because they signed a non-disclosure agreement when they left the company. “It was often that making the president happy was the top priority.”

Beyond that structural concern, Zuckerberg and other top executives regularly weigh in on content decisions, according to the document and a slew of previously publicized examples.

Zuckerberg is not necessarily unique in this. At Twitter, CEO Jack Dorsey was involved in the decision to ban Trump from the platform after the Jan. 6 assault on the Capitol by a throng of his supporters (although the official call was made by Vijaya Gadde, Twitter’s top lawyer and safety expert). But Zuckerberg’s heavy involvement in content decisions resonates differently as Facebook continually insists that it is focused on enforcing its rules fairly, regardless of political considerations.

"In multiple cases, the final judgement about whether a prominent post violates a certain written policy are made by senior executives, sometimes Mark Zuckerberg,” the 2020 document reads. “If our decisions are intended to be an application of a written policy then it's unclear why executives would be consulted.” The author said the regular reliance on senior management suggests “an unwritten aspect to our policies, namely to protect sensitive constituencies.”

In October 2020, one Facebook employee, whose name was redacted, posted a comment on an internal board saying Zuckerberg had expressed “a strong preference for a more reserved approach to moderation overall” in closed-door conversations.

But Zuckerberg remains involved in day-to-day content-related activity. On Jan. 19, a team of internal researchers found that Facebook was continuing to recommend political groups to users, despite an earlier pledge that it would temporarily stop pushing users to join those groups in the run-up to the 2020 election, according to an internal post. (Facebook repeated the pledge on Jan. 11, saying it was trying to reduce divisiveness on the platform.) The researchers escalated their investigation into why that had happened, the post shows.

When one employee asked in an internal discussion board why the matter had been “upleveled,” another person responded, “This is related to an ongoing PR related event that’s being reviewed with Mark.” (Facebook employees commonly refer to Zuckerberg by his first name.) The investigative tech publication The Markup had posted an article that day showing that Facebook was continuing to push partisan political groups to users.

Chambliss, the Facebook spokesperson, said the recommendations had persisted because of “technical issues in the designation and filtering process that allowed some Groups to remain in the recommendation pool when they should not have been.”

“Since becoming aware of the issue, we worked quickly to update our processes, and we continue this work to improve our designation and filtering processes to make them as accurate and effective as possible,” Chambliss said.

Zuckerberg’s involvement in some high-profile content decisions has already prompted public criticism of Facebook, including after he rejected calls to remove or penalize a May 2020 post in which Trump warned that “if the looting starts, the shooting starts.” The then-president’s message came as racial justice protests were spreading across the country, and many Facebook employees joined outside critics in calling the post a violation of the platform’s rules against threats of violence.

On June 1, 2020 — the same day Facebook employees staged a walkout in support of civil rights — one staffer posted a warning on the company’s internal message boards about the influence of political considerations on the company’s approach to racial justice.

“I have heard from many colleagues on the content policy team that they feel pressure to ensure their recommendations align with the interests of policymakers,” the employee wrote in a post titled “Bending Our Platforms Towards Racial Justice.”

“They attribute this to the organizational incentives of having the content policy and public policy teams share a common root,” the employee continued. “As long as this is the case, we will be prematurely prioritizing regulatory interests over community protection.”

Policy teams just ‘doing their jobs’

Facebook started ignoring warnings about the outsized role of its public policy team as early as September 2018, according to one former company employee who spoke to POLITICO on condition of anonymity. The public policy team pushed to allow the Trump campaign to run an ad shortly before that year’s midterm elections claiming that murderous immigrants were seeking to invade the U.S., the person said. Ultimately, the company dismissed the public policy team’s input and publicly said it would no longer run the ad, but only after the Trump campaign had spent thousands of dollars running it on Facebook amid enormous public scrutiny.

Chambliss did not deny this account but said Facebook “evaluated this ad against our rules and determined it violated our policies. All advertisers must comply with our ad policies regardless of whether someone is a politician.”

Katie Harbath, who served as a public policy director with Facebook until she left in March of this year, said critics are oversimplifying the issue. Public policy represents only one of myriad viewpoints that executives consider when they make difficult decisions, she said.

“They’re doing their jobs by providing what the political fallout would be,” Harbath said of the policy team. “The company would be doing a disservice if they didn’t at least know what those potential fallouts could be.”

Harbath, a longtime Republican and former party official, said she herself spent time at Facebook explaining to engineers how particular product decisions could upset conservatives — not out of fear of backlash, but because it was her job to make sure the company knew how its decisions would resonate with people on both sides of the aisle.

Harbath said the real question is to what extent top executives, such as Zuckerberg and Chief Operating Officer Sheryl Sandberg, consider political retribution when they make particular decisions.

It’s also unclear how nimble the policy shop has been in adjusting to the changed political atmosphere of the Biden era.

Whereas much of the Trump administration’s scrutiny of Facebook revolved around allegations of anti-conservative bias, the Biden administration has made it clear that its priority for the platform is vaccine misinformation. President Joe Biden earlier this year accused Facebook of “killing people” by failing to take stringent enough action against misinformation about Covid-19 vaccines.

Facebook has undertaken significant measures to crack down on lies about Covid-19, the internal documents show. But its efforts have still fallen short as lies and skepticism about vaccines flood Facebook’s comments sections, according to research compiled by Facebook data scientists.

The Biden administration has been hammering Facebook for failing to take down the accounts of the “Disinformation Dozen,” a list of 12 influential people that outside researchers have identified as being responsible for more than half of vaccine misinformation on Facebook. The company has largely gone on the defensive in response — including adding some prominent Democrats to its D.C. lobbying team.

The bad blood fueled by Facebook’s past content decisions still lingers, however.

“I think they’ve gone out of their way to hire Democrats that have good relationships with folks on the Hill and with the Biden admin,” said Lehrich, the Accountable Tech co-founder. “But at the end of the day, the relationship became so adversarial between Facebook and the Biden campaign and the party writ large, it certainly doesn’t swing fundamentally from trying to appease the right to trying to appease the left all of the sudden.”

CORRECTION: Because of incorrect information provided by Facebook, an earlier version of this report misstated who Facebook's India public policy team reports to. It is Facebook's vice president and managing director in India, Ajit Mohan.