Facebook finds 111 accounts responsible for majority of anti-vaccine content

Facebook report raises issues about vaccine misinformation

(REUTERS)

Facebook has found 111 accounts responsible for a majority of the anti-vaccine content found on the social media site.

The finding follows an internal study of vaccine misinformation shared by Facebook users, which was seen by the Washington Post.

While false statements surrounding vaccines are already banned, thousands of pieces of content were described as being in a “grey area” for algorithms and moderators.

The 111 accounts responsible for anti-vaccine content were not named in the report, according to the Post, and were found by placing the firm’s US users into 638 “population segments” or groups.

Facebook did not identify which users were in the groups, or the reasoning behind the categorisations, although each segment was said to contain around 3 million users.

Of those segments, as few as 10 were found to contain 50 percent of all anti-vaccine content on the platform, categorised in the report as posts showing “vaccine hesitancy”, or “VH”.

And in the segment with the most anti-vaccine content, just 111 users were found to be responsible for half of all the vaccine-hesitant content found across Facebook.

It was not clear if Facebook took action against the 111 accounts identified for sharing “vaccine hesitancy”, despite the firm finding a connection to the QAnon conspiracy.

According to the Post, Facebook researchers wrote that “It’s possible QAnon is causally connected to vaccine hesitancy,” with mistrust also shown towards authority in the QAnon online community.

Researchers were also reportedly worried that “VH” content, although not found by Facebook’s algorithms to have broken any rules, could be harmful to certain groups of people.

“While research is very early, we’re concerned that harm from non-violating content may be substantial,” Facebook’s study said.

Facebook, recently under fire for allowing false claims about coronavirus to be shared on its site, was last year found to have played a part in the spread of “Plandemic”, a documentary filled with misinformation.

Facebook says it has removed both vaccine disinformation and content connected to the QAnon conspiracy in recent months, following the approval of a number of vaccines in the UK and US and the storming of the US Capitol.

The Independent has approached Facebook for comment.