Exclusive: Facebook doesn't want to decide 'what's true and what's false,' exec says

Anti-vax myths, distorted Nancy Pelosi videos, a conspiracy theory that a recent mass shooter was a supporter of presidential candidate Beto O’Rourke — misinformation abounds on Facebook (FB). In an exclusive interview, top Facebook executives said the company has made progress addressing false posts but still struggles to identify them, especially in the highest-stakes regions, where misinformation can lead to deadly violence.

“We don't want to be in the position of determining what is true and what is false for the world,” says Monika Bickert, the head of global policy management, which sets the rules for the site’s 2.4 billion users. “We don't think we can do it effectively.”

“We hear from people that they don't necessarily want a private company making that decision,” she adds.

Reluctant to judge veracity on its platform, Facebook partners with fact-checking organizations that vet posts, an arrangement that began after the 2016 presidential election. But Bickert acknowledged that the company often lacks such partnerships in violence-prone regions.

“The sad reality is, in the places in the world where you are most likely to have on the ground violence, those are often the same places where it's hard to have a fact-checking partner, or even a safety organization, tell us what the real situation is on the ground,” she says.

Last year, U.N. Human Rights experts examining violence perpetrated against Rohingya Muslims in Myanmar said that social media played a “determining role” in the conflict. Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar, specified: “As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.”

A Facebook-commissioned report, released last November, acknowledged that the company had failed to prevent the platform from “being used to foment division and incite offline violence” in Myanmar.

Facebook Head of Policy Management Monika Bickert participates in a discussion and question-and-answer session about 'Internet Security and Privacy in the Age of Islamic State' at the Washington Institute for Near East Policy, February 26, 2016, in Washington, DC. A former U.S. attorney at the Justice Department, Bickert began work at Facebook in 2012 as lead security counsel, advising the company on matters including child safety and data security. (Photo by Chip Somodevilla/Getty Images)

“If we have misinformation where a safety partner is able to confirm that can contribute to imminent or ongoing violence on the ground, then we will remove it,” Bickert says.

Concerns about false and inauthentic posts on Facebook reached a fever pitch after the 2016 presidential election, the outcome of which some have attributed to a disinformation campaign on the platform carried out by a Russian intelligence agency. The Mueller Report, released in April, detailed Russia-operated Facebook Groups like “United Muslims of America” and “Being Patriotic” that each had hundreds of thousands of followers.

This picture, taken on December 18, 2018, shows Myanmar youths browsing Facebook at an internet shop in Yangon. Facebook removed hundreds of additional pages and accounts in Myanmar with hidden links to the military, the platform said on December 19, as the company scrambled to respond to criticism over failures to control hate speech and misinformation. (Photo by Sai Aung Main/AFP/Getty Images)

The site drew criticism early this year for allowing opponents of vaccination to spread false information about vaccine safety, and in May, for permitting distorted videos of U.S. House Speaker Nancy Pelosi (D-CA) to be viewed millions of times. (Facebook reduced the distribution of such videos and attached a warning to them, but did not remove them.)

In an exclusive interview at Facebook’s Menlo Park headquarters, Yahoo Finance Editor-in-Chief Andy Serwer spoke with the three executives who oversee content at Facebook — Bickert; John DeVine, VP of Global Operations; and Guy Rosen, VP of Integrity.

The top executives said the company has come a long way in addressing misinformation since the 2016 election.

“We've already made a lot of progress on misinformation,” says Rosen, who oversees the development of products that identify and remove abusive content on the site.

Last Thursday, Facebook launched a partnership with the World Health Organization (WHO) that will direct users searching for information on vaccines to the WHO’s website.

“There's always going to be continued challenges,” Rosen says. “And it is our responsibility to make sure that we are ahead of them and that we are anticipating what are the next kind of challenges that bad actors are going to try to spring on us.”


Andy Serwer is editor-in-chief of Yahoo Finance. Follow him on Twitter: @serwer.

