As part of wide-ranging Senate testimony Tuesday on a trove of leaked internal documents, former Facebook employee Frances Haugen said the social network has had a “destructive impact” on society that led to real-life violence in Myanmar and Ethiopia.
In a new interview with Yahoo Finance, Facebook (FB) VP of content policy Monika Bickert says the company is working to improve its content moderation capabilities in regions outside of the U.S. and has already done so in Ethiopia.
“Language is a challenge,” Bickert said. “And one of the reasons it's a challenge is because when you're building technical tools to find abusive content or violating content, you need examples, so that you can train the machines to go and find this stuff.”
The internal documents, which Haugen provided to The Wall Street Journal and “60 Minutes,” show that Facebook has been unable to address the spread of speech meant to incite sectarian violence in regions like Myanmar and Ethiopia because it doesn’t have enough employees who speak the local languages to properly moderate content.
“What we saw in Myanmar and are seeing in Ethiopia are only the opening chapters of a story so terrifying, no one wants to read the end of it,” Haugen said during her testimony before the Senate Commerce Committee’s Subcommittee on Consumer Protection on Tuesday.
But Facebook is gradually improving its content moderation in foreign languages by using machine learning, according to Bickert. “We get better and better over time, and often we'll start doing something in one language, and then as we build the capability … we will roll out to other languages,” she said. “... And we'll continue to get better at the technical components of this.”
Beyond that focus on machine learning, Bickert says Facebook has significantly invested in what it refers to as its network of trusted partners.
“These are organizations around the world who are focused on safety issues, or how speech can affect certain communities in different areas that we can take those learnings in and make sure we're doing what we can to keep the site safe,” she said.
In Ethiopia, Bickert says, Facebook is working with nongovernmental organizations to get a better grasp of the situation and remove any harmful content.
“We have special teams who proactively...identify who are the NGOs, or who are the academics or others who could let us know what the trends might be, what the risk might be,” she said. “And then sometimes we actually put bespoke policies in place where we will say, this is a term or this is a trend that we are seeing in this region, we need to make sure we're on top of it.”
However, The Wall Street Journal reported that Facebook doesn’t spend nearly as much time policing content outside of the U.S. as it does domestic content. Citing the documents leaked by Haugen, the Journal reported that of the 3.2 million hours employees and contractors spent locating and removing false and misleading information, only 13% was spent addressing such problems outside of the U.S.
But Bickert says the company has dramatically improved its ability to moderate content and continues to get better at doing so.
“The nature of social media is that you have people sharing content real time, and so we are not perfect at catching everything that comes through the door,” she said. “But we’ve invested significantly, billions of dollars, in building a safer platform and that includes building technical systems that over time will allow us to do this work better and better.”
Got a tip? Email Daniel Howley at email@example.com or via encrypted mail at firstname.lastname@example.org, and follow him on Twitter at @DanielHowley.