Facebook oversight board rulings so far

Upheld

  • The content: A post used a slur to refer to Azerbaijanis in the caption of photos that it said showed churches in the country’s capital.

  • Why Facebook removed it: The company said the post violated its policy against hate speech.

  • Why the board agreed: The way the term was used “makes clear it was meant to dehumanize its target.”

Overturned

  • The content: A user in Myanmar posted photos of a Syrian child who had drowned trying to reach Europe, and suggested that Muslims were more upset by the killings in France over cartoon depictions of the Prophet Muhammad than by China’s treatment of Uyghur Muslims.

  • Why Facebook removed it: The company said the content violated its policy against hate speech.

  • Why the board overruled Facebook: While the comments could be seen as offensive, they did not rise to the level of what Facebook considers hate speech.

  • The content: An Instagram post about breast cancer awareness from a user in Brazil showed women’s nipples.

  • Why Facebook removed it: The company’s automated content moderation system removed the post for violating a policy against sharing nude photos. Facebook restored the post after the oversight board decided to hear the case, but before it ruled.

  • Why the board overruled Facebook: Facebook’s policy on nudity contains an exception for “breast cancer awareness.” The board added that the automated removal showed a “lack of proper human oversight which raises human rights concerns.”

  • The content: A user posted a quote misattributed to the Nazi propagandist Joseph Goebbels.

  • Why Facebook removed it: The company said it violated its policy against “dangerous individuals and organizations.”

  • Why the board overruled Facebook: The board said the post did not promote Nazi propaganda but criticized Nazi rule.

  • The content: A user in France falsely claimed that a certain drug cocktail could cure Covid-19 and berated the French government for refusing to make the treatment available.

  • Why Facebook removed it: The company said the post violated its policy against misinformation that could cause real-world harm, arguing that it could lead people to ignore health guidance or attempt to self-medicate.

  • Why the board overruled Facebook: The post did not pose an imminent threat to people’s lives because it aimed to change a government policy and did not encourage people to take the drugs without a doctor’s prescription.

  • The content: A post in a Facebook group for Indian Muslims included a meme that appeared to threaten violence against non-Muslims. It also called French President Emmanuel Macron the devil and urged a boycott of French goods.

  • Why Facebook removed it: The company said the post contained a “veiled threat” and violated its policy against inciting violence.

  • Why the board overruled Facebook: The post, while incendiary, did not pose an imminent risk of violence, and its removal overly restricted the user’s freedom of expression.