How social media misinformation wins — even if you don't believe it

Disinformation is propaganda. It can also be incredibly effective, even when we know it's not true.

Almost 15 years after Stephen Colbert introduced the term "truthiness" into the modern lexicon, news consumers find themselves awash in misinformation, fake news, and alternative facts. The problem is twofold: if disinformation fits their established worldview, people believe it. If it doesn't, it shapes how they view the people sharing it. Either way, it fuels political divisiveness and pits Americans against each other.

At this point, we're all familiar with the prevalence of misinformation online. As the Australian bushfires raged, for instance, the hashtag #arsonemergency blamed arson, not climate change, for the blazes. Researchers at Queensland University of Technology found that many of the accounts pushing the arson narrative were trolls or bots. During the 2016 election, the Russian Internet Research Agency created hundreds of fake Facebook pages with names like "Blacktivist," "Born Liberal," and "Army of Jesus." There's every reason to think they'll work from the same playbook during the 2020 election season, as evidenced by the reported recent hacking of the Ukrainian gas company with ties to Hunter Biden. Facebook, used by 70 percent of American adults, recently stated that it would not attempt to vet the veracity of claims in political advertisements.

To be sure, the billions of social media users worldwide need to be careful with the information they consume and share. Fact-checking websites such as snopes.com or politifact.com can help discern outright falsehoods. Sites such as AllSides illustrate how different news outlets frame their stories. For Twitter users, bot detectors such as Botometer or Bot Sentinel allow anyone to see which conversations bots are pushing.

However, the success of digital propaganda rests less on whether social media users believe the actual information and more on how these messages change people's perceptions of each other. As a social media expert at DePaul University, I research the ways people encounter and process online information. My current projects, building on a theory of skewed information diffusion, argue that fake news isn't effective because it's convincing; it's effective because it reinforces our existing assumptions and undermines our ability to respect and trust those with whom we disagree.

Bots and trolls exploit an influence process called social proof, in which individuals judge behaviors and beliefs as more appropriate when they see other people engaging in them. Social proof plays on the desire for interpersonal connection. Yet manufactured connections can be used by disinformation campaigns to sow discord.

Trolls distort social proof by promoting messages they do not necessarily believe; bots do so by amplifying messages that have little actual social consensus. When the memes and posts of disinformation campaigns align with currently held beliefs, the inflated social proof can push social media users toward increasingly extreme positions. If users see a hashtag such as #NancyPelosiFakeNews trending on Twitter, they may believe the trend reflects genuine voter concerns about the Speaker of the House, not realizing that in the same timeframe it is also the top trending hashtag used by known trollbots.

When people see others with divergent opinions sharing these low-quality memes, they view those people as uninformed and easily duped. Identifying the messages others share as disinformation exacerbates this problem, leading people on different sides of an issue to hold increasingly negative views of each other.

Dave Karpf, Associate Professor of Media and Public Affairs at George Washington University, argues that digital propaganda can affect our democracy. Citizens are unlikely to change their politics because of a bot-driven tweet storm or memes flowing from a Facebook group, but when all information can be derided as fake, it becomes difficult to have reasonable disagreements.

More troubling still, people are particularly bad at judging the quality and truthfulness of a message when the issue is personally important. The information social media users most want to share and consume concerns issues in which they are ego-involved, or personally passionate. Unfortunately, when people really care about an issue, they may fill in the blanks of a poorly constructed or false message with what they already believe. This cognitive bias has little to do with overall intelligence; recent studies show ego-involvement had a greater effect than numeracy on how messages were perceived.

Social media has provided a way for people and groups to be heard and connect in ways never before possible. Yet, bombardment with false information leads to an environment of mistrust. When people struggle to know if anything they see online is real, information and evidence become equally meaningless.

Still, there is evidence that discernment can be a deciding factor. A recent study by researchers at Nanyang Technological University suggests that if users recognize fake news as false, they often simply ignore it.

Simply working to sort truth from falsehood, or going on a "disinformation diet," will likely do little to counter the discord disinformation campaigns are attempting to sow. While the truth is important, social media users also need to be thoughtful about how they engage with others who hold different viewpoints. The purpose of disinformation campaigns is to exacerbate the divisions between people on different sides of an issue. It is important not only to seek the truth but also to resist letting those campaigns drive us, as fellow citizens, apart.

Heading into the contentious 2020 election season, it is wise to be skeptical, but also to understand the emotions that underpin individual beliefs.
