U.S. stops helping Big Tech spot foreign meddling amid GOP legal threats


The U.S. government has stopped warning some social networks about foreign disinformation campaigns on their platforms, reversing a years-long approach to preventing Russia and other actors from interfering in American politics less than a year before the U.S. presidential elections, according to company officials.

Meta no longer receives notifications of global influence campaigns from the Biden administration, halting a longtime practice involving the federal government and the world's largest social media company, senior security officials said Wednesday. Federal agencies have also stopped communicating about political disinformation with Pinterest, according to the company.

The developments underscore the far-reaching impact of a conservative legal campaign against initiatives established to avoid a repeat of the 2016 election, when Russia manipulated social media in an attempt to sow chaos and swing the vote for Donald Trump. Republican lawmakers have even proposed cutting funding for combating foreign disinformation and have subpoenaed government agencies, including the State Department's Global Engagement Center, which counters foreign propaganda.

For months, researchers in government and academia have warned that a barrage of lawsuits, congressional demands and online attacks is having a chilling effect on programs intended to combat health and election misinformation. But the shift in communications about foreign meddling signals how ongoing litigation and Republican probes in Congress are unwinding efforts once viewed as critical to protecting U.S. national security interests.

Ben Nimmo, chief of global threat intelligence for Meta, said government officials stopped communicating foreign election interference threats to the company in July.

That month, a federal judge limited the Biden administration's communications with tech platforms in response to a lawsuit alleging such coordination ran afoul of the First Amendment by encouraging companies to remove falsehoods about covid-19 and the 2020 election. The decision included an exemption allowing the government to communicate with the companies about national security threats, specifically foreign interference in elections. The case, Missouri v. Biden, is now before the U.S. Supreme Court, which has paused lower court restrictions while it reviews the matter.

The litigation and political scrutiny have led to broad uncertainty among foreign policy officials about what communications with tech companies are appropriate, according to a former State Department official, who spoke on the condition of anonymity because of legal risks.

"If you start asking those people to second-guess every time they need to send an email or pick up the phone to do pretty standard work that we've asked them to do on our behalf . . . it's going to make the government less functional," the person said.

The Justice Department, the FBI and the State Department declined to comment. The White House did not respond to a request for comment.

The shift erodes a partnership considered crucial to the integrity of elections around the world - just months before voters head to the polls in Taiwan, the European Union, India and the United States. Ahead of the 2024 U.S. presidential race, foreign actors such as China and Russia have become more aggressive at trying to exacerbate political tensions in the United States, while advanced artificial intelligence allows bad actors to easily create convincing political propaganda.

Sen. Mark R. Warner, the Democratic chair of the Senate Intelligence Committee, said "legal warfare by far-right actors" has led to a dire situation.

"We are seeing a potential scenario where all the major improvements in identifying, threat-sharing, and public exposure of foreign malign influence activity targeting U.S. elections have been systematically undermined," the senator from Virginia said in a statement.

Social media companies have long communicated with law enforcement about threats of child pornography and terrorism, but they did not discuss the threat of Russian interference during the 2016 campaign. Amid revelations of that interference, the firms began meeting with the FBI and Department of Homeland Security officials responsible for protecting elections from foreign interference to share information about potential threats ahead of the 2018 midterms. Tech companies such as Meta, Google and Twitter, now known as X, have also routinely relied on warnings from civil society groups and outside researchers about disinformation threats on their platforms.

"We believe that it's important that we continue to build on the progress the defender community has made since 2016 and make sure that we work together to keep evolving our defenses against foreign interference," Nimmo told reporters on a call.

Missouri v. Biden - and a parallel investigation in Congress led by Rep. Jim Jordan (R-Ohio) - has led to broad legal uncertainty about interactions between the federal government and the tech industry. Most of the allegations in the lawsuit focus on ways federal officials allegedly pressured social networks to remove misleading posts about coronavirus vaccines and elections.

But Meta's announcement suggests that the Biden administration is broadly pulling back from even routine communications with Silicon Valley.

The federal judge's July 4 ruling prohibited key agencies - including the State Department, the FBI and DHS - from urging companies to remove "protected free speech" from the platforms. However, Trump-appointed Judge Terry A. Doughty appeared to acknowledge concerns that the decision could dismantle election integrity initiatives, specifying that the restrictions did not apply to warning companies of national security threats or foreign attempts to influence elections. The 5th Circuit Court of Appeals ruling removed some of the restrictions, including communication with the State Department.

"The fact that the government doesn't have clear guidance creates this instinct to err on the side of caution and just not do anything lest they be seen as doing something problematic," said Evelyn Douek, an assistant professor at Stanford Law School.

The conservative legal strategy is an evolution in a years-long effort to prevent companies from allegedly suppressing GOP views online. In addition to the litigation, Republicans, led by Jordan, have used their control of the House of Representatives to demand documents and testimony about the tech companies' interactions with the Biden administration and accuse the White House of illegally colluding with Silicon Valley.

Jordan said in a statement Thursday that the federal government and tech industry's efforts to combat disinformation have resulted in "the suppression of Americans' voices."

"We will continue to protect Americans' First Amendment rights and put a stop to the censorship industrial complex," he said.

Jordan and other House Republicans have zeroed in on the State Department's Global Engagement Center, which has a mandate from Congress to combat foreign propaganda aimed at influencing the United States and its allies. Jordan called on the House Appropriations Committee to cut funding for the organization, and the increased political scrutiny could hamper efforts to extend the agency's authorization, which is set to expire next year.

Rep. Michael McCaul (R-Tex.), chair of the House Foreign Affairs Committee, and several other Republicans sent a letter to the GEC earlier this year, demanding documents. In a Thursday statement, McCaul said he was "concerned about mission creep" beyond the agency's original goal of combating terrorism.

"My committee intends to exercise its full legislative and oversight jurisdiction over the GEC's lack of transparency and get answers for the American people," McCaul said.

Daniel Kimmage, the principal deputy coordinator of the GEC, said at an October hearing in the House that there was "no substitute" for continued congressional support of the agency.

"We must ensure the United States does not fall behind our adversaries and competitors as they seek to manipulate the global information environment for corrupt and coercive purposes," he warned lawmakers.

During a Senate hearing in October, Homeland Security Secretary Alejandro Mayorkas and FBI Director Christopher A. Wray said that they had overhauled their communications with the tech industry in the wake of the Missouri v. Biden litigation, following questioning from Sen. Rand Paul (R-Ky.).

"We're having some interaction with social media companies, but all of those interactions have changed fundamentally in the wake of the court's ruling," Wray said.

Wray said the changes were made "out of an abundance of caution" to ensure the agency does not run afoul of any court rulings. Mayorkas said DHS no longer participates in periodic meetings with tech companies and other government agencies in which they previously discussed the "threat environment that the homeland faced."

University academics and disinformation research groups are also in limbo. Many are seeking affordable legal representation to defend themselves against mounting cases and reevaluating their communication with industry and the public.

"The trust and safety workers are gone. The relationships with external researchers is now gone," said Anika Collier Navaroli, senior fellow at the Tow Center for Digital Journalism at Columbia University and a former senior Twitter policy official. "And now this third piece of the actual information from the government is gone. . . . So we're basically unprotected."

Nathaniel Gleicher, head of security policy at Meta, said that while the company has resources to detect coordinated attacks on its social networks, the government is often more adept at tracking campaigns that are organized off social media. Before the 2020 U.S. election, Meta dismantled three covert influence operations based in Russia, Mexico and Iran after receiving tips from law enforcement about their off-platform activity, according to Gleicher.

"Our investigators might not know that a campaign is coming until the last minute," he said. "If they are operating off of our platforms, there are a number of times when a tip from [the] government has enabled us to take action."

Influence operations from Russia, Iran and China continue to aim at U.S. domestic targets. Meta said Thursday that it dismantled a group of 4,789 Facebook accounts posing as Americans discussing politics in the United States, often criticizing both sides of the political aisle. Some of those accounts appeared to be copying and pasting content from X onto Facebook, including posts by elected officials. In some instances, the network amplified X owner Elon Musk's tweets on his platform.

The threat of such campaigns might only grow as the 2024 presidential campaign heats up. Meta warned that if the Russia-Ukraine war or U.S.-China relations become hot-button election issues, it expects foreign influence operations to target those debates, as well.

Renée DiResta, a technical research manager at the Stanford Internet Observatory, said the 2022 midterms showed that both political parties are vulnerable to these campaigns.

"These operations are real, they are global, and they target all political parties and positions - this is not a partisan issue," she said. "In the U.S. 2022 midterms, we saw Iran targeting the progressive left and China targeting both the left and the right to advance state interests."

Graham Brookie, vice president and senior director of the Atlantic Council's Digital Forensic Research Lab, said China-based foreign influence campaigns have evolved to spread conspiracy theories or target leaders.

"It's not getting better," Brookie said. "The cost of engaging in foreign influence activities, especially in online information environments, has not gone up for bad actors."

- - -

Joseph Menn contributed to this report.