Don't give Facebook and YouTube credit for shrinking Alex Jones' audience

Internet platforms were making money by placing Infowars’ content in front of those who would not otherwise view it

Conspiracy theorist and Infowars founder Alex Jones has been banned from Facebook, YouTube and other internet platforms. Photograph: Jim Bourg/Reuters

It is an iron law of the internet that any attempt to censor or suppress information will inevitably result in the increased dissemination of that information. Just as the laws of thermodynamics undergird everything we know and can learn about the physical world, this rule – known as the Streisand Effect – sets the table for every debate around speech on the internet.

It was thus only to be expected that when Facebook, YouTube and other internet platforms decided to ban conspiracy theorist Alex Jones’s fake news broadcasts in early August, Infowars’ traffic and reach would only increase.

“The more I’m persecuted, the stronger I get,” Jones reportedly said in response to the mass banning. “It backfired.”

But a new report by the New York Times suggests that, in fact, traffic to Infowars’ website and video broadcasts has fallen precipitously in the wake of his banishment from Facebook and YouTube. According to the Times’ analysis, Jones’ reach fell from about 1.4 million visitors each day to just 715,000, and a temporary spike in traffic to the Infowars website did not make up for the approximately 900,000 video views a day that Facebook and YouTube had been delivering in the three weeks before the bans. (Jones disputed the Times’ analysis on Twitter, a platform that bucked the trend of banning him but whose reach is significantly smaller than YouTube’s or Facebook’s.)

That the de-platforming of Alex Jones has reduced the number of people exposed each day to his particularly noxious brew of conspiracy theories, hate-mongering, misinformation, harassment and other bile is certainly welcome news.

But before we give Facebook and YouTube too much credit for reducing Jones’ reach, it’s important to look at the equation from the other side: until one month ago, Facebook and YouTube combined were apparently responsible for doubling Infowars’ audience.

They were not just serving as passive platforms, hosting content for those who sought it out. They were placing Infowars before the eyeballs of people who would not otherwise consume it, and they were making money off that transaction.

“I think that what is reflected in the traffic going down is related to the power of social media to broadcast content to new audiences,” said Joan Donovan, a lead researcher at Data & Society’s Media Manipulation Initiative. “What we are seeing now is more of a reflection of the fanbase as it stands rather than a reflection of how the recommendation algorithm is serving the content to new audiences.”

In other words, Alex Jones was a small man, standing on the shoulders of internet giants in order to punch above his weight.

The symbiotic relationship between Infowars and the social media platforms was particularly potent because of the platforms’ incentive structure (they want to keep people on their platforms, where they will watch advertisements) and the algorithms they use to achieve that objective. Rather than expecting users to actively seek out information or entertainment, Facebook and YouTube feed them an algorithmically determined stream of whatever content is calculated to be most likely to keep them from clicking away.

“These algorithms work really well if you are into a subculture of music or really love scented candles and want to watch reviews of scented candles,” Donovan said. “It’s not problematic because you are seeing the things you are interested in, you consume it, and you move on with your day.”

“But when you’re doing it with news, it does have a different effect on political polarization,” Donovan added. “If you’re looking at extremist videos, particularly stuff related to the alt-right, [the algorithm] sees that you typed in that keyword, and it wants to keep serving you stuff related to that keyword.”

Donovan said that Infowars was particularly suited to Facebook’s and YouTube’s algorithms, because they are looking for “freshness and relevance”. Jones broadcasts for hours each day, and Infowars then slices and dices his rants into short videos designed for social media platforms.

“It really tips the recommendation system towards Infowars because they have content about almost everything you can imagine, as well as having content that is new online,” she said. “Very few media makers can produce at that kind of rate online.”

Indeed, since publication of the Times’ article on Tuesday morning, Infowars has shared at least five videos disputing it on Twitter.

Facebook and YouTube are, of course, not solely responsible for amplifying Jones and his ilk. The traditional media are also grappling with the question of how best to cover the alt-right and other extremists, many of whom court media attention in order to hijack our platforms for their own ends. Monday’s contretemps over the New Yorker’s (since reversed) decision to invite white nationalist Steve Bannon to headline its annual ideas festival was just one example of how frequently the traditional news media err.

But more editors and journalists are discussing ideas such as “strategic silence”, and publications do, at the very least, take responsibility for their editorial decisions rather than blaming an algorithm or their readers.

“It is possible to have an ethic and a process of social media moderation that mirrors the ethic and practice of journalists,” Donovan said. “If [the internet platforms] had paid attention to Alex Jones when he hit 10,000 followers, or 20,000, or 50,000, and done consistent content review to understand if the content contained conspiracy theories or targeted harassment, they then would have had a handle on the issue, and it wouldn’t have ballooned into this PR crisis.”