WASHINGTON — The social media giant Facebook has spent years refusing to act as an “arbiter of truth,” which in practice has meant allowing political figures to publish demonstrably false and sometimes racist information on its platforms.
But with the company under fire from both the left and the right, Facebook has begun offering concessions to activists as it braces for new federal regulations.
Faced with an advertiser boycott, one of CEO Mark Zuckerberg’s top lieutenants recently denied that the company makes money by spreading hate. And on June 26, Zuckerberg said Facebook would — for the first time — affix labels to some posts that might violate its standards, one week after the company removed a Trump campaign ad that included what critics charged was a Nazi-era symbol.
This reversal came after Zuckerberg had doubled down May 28 on his long-held position against calling out false statements. “I believe strongly that Facebook shouldn’t be the arbiter of truth of everything that people say online,” he said then on Fox News.
On June 30, Facebook tweaked its algorithm to prioritize original sources of information rather than derivative articles from nonjournalism websites. And on July 1 the company’s global affairs point man, the former British politician Nick Clegg, wrote a post headlined “Facebook does not benefit from hate.”
Although no stranger to bad press, Facebook has been dealing with an unprecedented set of challenges in recent weeks. Major corporations and retailers have joined a growing boycott of Facebook advertising, blaming the company for spreading bad information and vitriol. The corporations now boycotting it include Coca-Cola, Ford and Verizon, the parent company of Yahoo News.
Much of the uproar, which began internally among Facebook employees, is over its treatment of President Trump. While Twitter has recently taken the lead in fact-checking Trump’s false statements, Facebook has been accused of giving Trump preferential treatment.
And at the same time, the chances of Democrats winning back the White House this November, and possibly a Senate majority as well, have gone up considerably.
Much more than the ad boycott, the prospects of new regulation and antitrust investigations are an existential concern for the company and other tech giants like Google and Twitter. According to tech experts, the recent moves by all these companies should be viewed as attempts to position themselves to influence regulation that most people in the industry say is inevitable.
More robust oversight of Silicon Valley is all but assured if Democrats take control of the White House and Congress this fall, and could pose a significant challenge for companies like Facebook. Proposals to break it up into smaller pieces — such as spinning off its massively popular Instagram and WhatsApp platforms — would suddenly be on the table.
In that scenario, the Republican Party, even in minority status, might be Facebook’s only hope of staving off harsh regulations, multiple experts and political insiders told Yahoo News. Democrats are more ideologically inclined toward the regulation of large corporations than Republicans, who have historically taken a more laissez-faire approach to federal oversight.
At the same time, some ambitious GOP lawmakers have also expressed their displeasure with social media companies, which they accuse of suppressing conservative speech.
These conflicting pressures have led Facebook to try to appease critics from across the political spectrum. On Tuesday, Facebook met with representatives from left-leaning advocacy groups and is set to release a review of its civil rights policies. At the same time, the company is likely to go only so far in trying to pacify complaints from progressives, in large part because it’s worried that it could alienate its would-be Republican defenders.
Tech executives will face the scrutiny of top House Democrats later this month, when Rep. David Cicilline holds an open hearing of the House Judiciary Committee’s antitrust subcommittee, which he chairs. Zuckerberg will testify alongside the CEOs of Amazon, Apple and Alphabet, the parent company of Google. The testimony is likely to be virtual rather than in person, a committee spokesman told Yahoo News.
Advocates for regulation say there’s no other way to stop Facebook from spreading misinformation and hate speech.
“The problem is that [Facebook’s algorithm] boosts the hateful content because that is what is most engaging and serves Facebook’s business model. I don’t think you can fix this problem by these little Band-Aids that Facebook has been offering,” said Sally Hubbard, an antitrust expert at the Open Markets Institute, which was formed in 2017 to push for greater government oversight of big tech.
Hubbard says Facebook’s business model creates a perverse incentive structure that rewards hate speech. Its algorithm prioritizes “engagement” — the reactions and responses from users on its platforms. And often it is false and inflammatory posts that draw the most engagement, as people argue with or rebut a post in comment sections. But thanks to the company’s algorithm, those responses only amplify the post and push it into the feeds of more users.
This engagement keeps users on the page, which gives Facebook a greater capacity to serve ads to them, increasing its profits. And because Facebook collects so much information about each user, it can provide highly targeted ad campaigns to businesses and political activists looking to reach as many people as possible. This also increases the value of the company’s advertising and ups its profits.
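The amplification loop Hubbard describes can be illustrated with a simple sketch. This is a hypothetical, deliberately simplified model of engagement-weighted ranking, not Facebook’s actual algorithm or weights; the point is only that when every reaction, comment and share raises a post’s score, an inflammatory post that draws heated rebuttals outranks a measured one.

```python
# Hypothetical illustration of engagement-weighted feed ranking.
# The weights and Post fields are invented for this sketch; they are
# not Facebook's actual system.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    reactions: int  # likes, angry faces, etc.
    comments: int   # includes arguments and rebuttals
    shares: int


def engagement_score(post: Post) -> float:
    # Every interaction counts toward reach regardless of sentiment:
    # an angry rebuttal in the comments boosts a post just as a
    # supportive share does.
    return post.reactions + 2 * post.comments + 3 * post.shares


def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-engagement posts surface first, reaching more users.
    return sorted(posts, key=engagement_score, reverse=True)


feed = rank_feed([
    Post("Measured policy analysis", reactions=50, comments=5, shares=2),
    Post("Inflammatory false claim", reactions=40, comments=80, shares=30),
])
print(feed[0].text)  # the inflammatory post ranks first
```

Under this toy scoring, the inflammatory post scores 290 to the measured post’s 66, even though it received fewer positive reactions: the rebuttals themselves do the amplifying.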
“Despite copious evidence to the contrary, too many policy makers and journalists behave as if internet platforms will eventually reduce the harm from targeted harassment, disinformation, and conspiracies through content moderation,” wrote Roger McNamee, an early investor in Facebook, in Time magazine recently.
“The sad truth,” continued McNamee, who is also author of a book critical of the company, “is that the content we have asked internet platforms to remove is exceptionally valuable and they do not want to remove it.”
Hubbard wants to bar Facebook from allowing such highly targeted advertising, but she and others think that breaking up Facebook and parts of Alphabet is also necessary to stop these companies from collecting too much information about users.
“You can’t stop the manipulation of voters by foreign interests or anyone who wants to interfere with the elections without putting a halt to the manipulation that the business model is built for,” Hubbard said. “The reason Facebook won’t do this is because it made almost $70 billion last year off this business model.”
Facebook claims it is motivated to reform by altruism. “We are making changes — not for financial reasons or advertiser pressure, but because it is the right thing to do,” Sheryl Sandberg, the company’s chief operating officer, wrote Tuesday morning.
Hubbard also argued that breaking up the big tech companies would allow for more competition from smaller companies that want to join the digital public square, giving companies like Facebook less of a monopoly on driving internet content.
McNamee argued in his Time editorial that Congress should create “an exception to the safe harbor of Section 230 of the Communications Decency Act for algorithm amplification of harmful content … guaranteeing a right to litigate against platforms for this harm.” This would essentially remove the blanket immunity that internet companies currently have from litigation for amplifying offensive content and misinformation.
Missouri Republican Sen. Josh Hawley has been a champion of changing portions of Section 230 to crack down on big tech companies, and other Republicans, like Sen. Ted Cruz and Sen. Marco Rubio, have expressed support for that idea. Fox News personality Tucker Carlson has also been highly critical of big tech and has locked horns with the conservative Heritage Foundation over the issue of regulation.
Meanwhile, Facebook has already been the most aggressive of the big tech companies in reaching out to Republicans in the Trump era. Joel Kaplan, a former deputy White House chief of staff under President George W. Bush, has helped arrange meetings between Zuckerberg and Republican leaders in Congress, as well as with right-wing media outlets. Last year, Kaplan also set up a meeting with Trump and Zuckerberg. That was followed by a private White House dinner in October attended by Zuckerberg, the president and the first lady, and a few others.
“If people who are in a position to regulate you have a positive view of you, you’re in a position to lobby on your behalf and you’ll have more of a say in whatever kind of regulation comes out,” said Shannon McGregor, a journalism professor at the University of North Carolina who studies the impact of social media on politics.
And news reports indicate that Facebook has already had some success at pushing back on government scrutiny. The New York Times’ Ben Smith reported that Justice Department antitrust investigations into Google and Amazon are “mature,” while a similar probe into Facebook is “not real at all.”
Facebook has also been the tech giant most resistant to progressive demands concerning its content. When Twitter first fact-checked Trump in late May after the president made wildly false claims about the likelihood of election fraud this fall, Zuckerberg refused to follow suit, and then made his comments on Fox News disavowing any role in alerting the public to lies or inaccurate statements by public figures.
And while Twitter was quick to block Trump’s incendiary comment that “when the looting starts, the shooting starts” in the wake of the protests surrounding the killing of George Floyd by a Minneapolis police officer, Facebook left Trump’s statement up.
“I’ve been struggling with how to respond to the President's tweets and posts all day. Personally, I have a visceral negative reaction to this kind of divisive and inflammatory rhetoric,” Zuckerberg wrote on May 29, four days after Floyd’s death triggered ongoing nationwide protests. “But I’m responsible for reacting not just in my personal capacity but as the leader of an institution committed to free expression.”
“I disagree strongly with how the President spoke about this, but I believe people should be able to see this for themselves, because ultimately accountability for those in positions of power can only happen when their speech is scrutinized out in the open,” he said.
Zuckerberg also said that “unlike Twitter, we do not have a policy of putting a warning in front of posts that may incite violence because we believe that if a post incites violence, it should be removed regardless of whether it is newsworthy, even if it comes from a politician.”
Zuckerberg’s specific focus on warnings for posts that incite violence obscured the fact that Facebook did not put warnings on posts of any kind. It had historically either pulled content down or left it up, a Facebook spokesman told Yahoo News.
On June 28, Zuckerberg announced that this was changing.
“We will soon start labeling some of the content we leave up because it is deemed newsworthy, so people can know when this is the case … but we’ll add a prompt to tell people that the content they’re sharing may violate our policies,” Zuckerberg wrote. He also announced steps to give users more accurate information about voting and to remove content that could be aimed at suppressing voting.
The changes to Facebook’s policies on hate speech come as Trump’s chances at winning reelection have sharply diminished. The president has received consistently low marks in the polls for his handling of the coronavirus pandemic and the unrest following Floyd’s death. Presumptive Democratic nominee Joe Biden now enjoys a double-digit lead against Trump, according to the Real Clear Politics polling average.
More dramatically, betting odds on the presidential election have undergone a radical shift. In late May, the betting markets still favored Trump, giving him a 50 percent to 43 percent edge over Biden. By June 28, however, the markets favored Biden by 59 percent to 36 percent, a striking reversal in barely a month.
“We can certainly interpret that [Facebook is] thinking it’s going to be Democrat-based regulation coming down the line in the next few years,” McGregor said.