Meta Will Offer Facebook Users Greater Control over Content Moderation

National Review has learned that the social-media giant Meta will roll out policies in the coming weeks that afford users greater freedom with their Facebook timelines. Users will be able to decide whether certain content is “demoted,” and they will also have greater control over fact-checked content, though the fact checks cannot be turned off entirely.

A Meta spokesperson told National Review that the new policies will give “people on Facebook even more power to control the algorithm that ranks posts in their Feed.”

“We’re doing this in response to users telling us that they want a greater ability to decide what they see on our apps. This builds on work that we’ve been doing for a long time in this area and will help to make user controls on Facebook more consistent with the ones that already exist on Instagram,” the spokesperson explained.

As a general rule, Meta ranks Facebook posts based on certain assumptions of what will be most valuable to each individual user, in theory improving each user’s feed.

However, Meta has other guidelines that see certain content removed entirely and certain content kept but moderated, or “demoted.”

Several buckets of Facebook content are moderated by Meta without being removed. The four that will now be subject to user control are: low-quality content such as scams; unoriginal content from websites seeking to boost ad revenue; sensitive content that is violent or graphic; and content that has been fact-checked and deemed not entirely true.

Users in the U.S. and globally will be able to stop the first two buckets of content from being demoted. They will also be able to keep sensitive content at the default level of moderation or turn moderation up.

Regarding the fourth bucket, users in the U.S. specifically will have the ability to reduce how large the fact-check notice appears on a fact-checked post. A user can make it rather small or make it so large that it covers the entire original post.

National Review asked Meta if the company has heard criticism of social-media companies from people on the right who believe the moderators are more aggressive with posts that contain conservative views. The company replied in the affirmative.

It pointed to the fact that two of the fact-checkers it partners with in the U.S. are right-leaning. CheckYourFact.com is associated with the conservative media company the Daily Caller. The Dispatch is also a partner.

Meta outsources its fact-checking “accreditation” to a group called the International Fact-Checking Network (IFCN), a subdivision of the nonprofit media institute Poynter. A total of 102 fact-checking organizations are signatories of the IFCN code of principles.

Mainstream fact-checkers have been repeatedly criticized by the right for having a liberal slant. Meta, however, thinks there is balance to the organizations it partners with.

The company told National Review that a single piece of content may carry ratings from multiple fact-checkers. Additionally, if one fact-checker rates something false and another rates it partly false, Meta would predominantly display the partly-false rating.

The company did confirm that, by default, Facebook content is moderated to the fuller extent until a user actively chooses to reduce the moderation.

Meta declined to take the converse route: leaving content as unmoderated as possible until a user intervened.
