No, We’re Not Living in a Post-Fact World

Since President Donald Trump’s election, if not before, a conventional wisdom has taken hold that Americans are living in a “post-truth” age. Whether they’re discussing Trump’s dealings with Ukraine or debates over climate change, many journalists, scholars and other observers now routinely declare that voters are consuming fake news and rejecting facts, putting the American democratic experiment at serious risk, particularly as another election approaches.

But here’s a bit of good news for the new year: This account, at best, overstates the case. Evidence we’ve gathered over the past four years, involving more than 10,000 participants and spanning the period from the 2016 election well into the Trump presidency, shows that the most pessimistic accounts of the decline of facts are, well, not entirely factual. We found that when presented with factually accurate information, Americans, whether liberal, conservative or somewhere in between, generally respond by becoming more accurate.

Our results, which have been published in multiple journal articles, were particularly stark when we aggregated our 13 studies: 32 percent of participants who were not presented with factually accurate information later expressed accurate beliefs, compared with nearly 60 percent of those who were. In other words, facts almost doubled the share of accurate beliefs.

One of the most pessimistic claims about facts in American democracy is that when people see factual information, they respond by becoming less accurate. In one famous 2010 study, factual corrections about the absence of weapons of mass destruction in Iraq prompted conservatives to become more convinced that WMD were present. This behavior is known as the “backfire effect,” and it has been documented in a handful of studies. It would indeed be worrisome if, when presented with accurate information that conflicted with their political beliefs, Americans simply rejected it.

We decided to investigate the prevalence of the backfire effect during the 2016 election, conducting a set of experiments on Americans of all political stripes. We used a wide variety of platforms, including nationally representative online samples (one of which was administered by Morning Consult) and telephone-based studies, which helped us recruit older, generally more conservative Americans. Across all of our studies, participants read misstatements by various politicians, including presidential candidates from both parties, on issues ranging from climate change to foreign policy to crime rates. To maximize the chance of inducing backfire, we tested many politically contentious issues, on which partisan positions tend to be more fixed. We then randomly assigned some participants to read factual corrections of the misstatements. Afterward, we asked all participants whether they believed the initial misstatements.

Our results were unambiguous: Those who saw factual corrections were substantially more likely to express factually accurate beliefs than those who did not. On average, people responded to the corrections by bringing their views closer in line with the facts. This was true across ideologies and across parties, and it held even when Democrats confronted misstatements by Democratic politicians and when Republicans confronted misstatements by Republican politicians. Supporters of then-candidate Trump were no different. In a study we ran on the night of his first presidential debate with Hillary Clinton, a correction of a misstatement Trump made during the debate caused his supporters to become more accurate: On a five-point scale, the average Trump supporter who had seen the correction was half a point more accurate than the average Trump supporter who had not.

We continued our research after Trump’s election and inauguration. During his 2019 State of the Union address, Trump described the southern U.S. border as “lawless.” Yet, as fact-checkers pointed out on the night of the speech, the volume of border crossings had declined dramatically. In a study conducted that night, we presented some participants with a factual correction. When we asked all participants whether they believed there was a surge of illegal crossings, those who had seen the correction were more likely to believe, correctly, that there was not. We observed particularly large gains in accuracy among conservatives who saw a correction, which suggests Trump has no special power to keep his supporters from accepting factually accurate information. Indeed, corrections increased the accuracy of the average conservative by three-quarters of a point on a seven-point scale.

Our findings were not entirely rosy. In one study, to test whether Trump has a unique capacity to sow belief in falsehoods, we took a set of his misstatements and attributed them, at random, to Senate Majority Leader Mitch McConnell. When the exact same fact-checks were applied to the exact same misstatements, with only the person purportedly delivering the misinformation changed, the fact-checks of the president’s statements produced smaller gains in factual accuracy. So, while Trump’s statements are not immune to factual correction, they appear to be more resistant to it than those of at least one other political leader in his own party.

We also searched for, but failed to find, evidence that factual corrections alone cause people to change their political views. People, it seems, can become more factually accurate without becoming any less supportive of their preferred politicians. Those who believe empirical evidence should govern political attitudes may find this disappointing.

On the one hand, our evidence cuts against prior findings, including the original backfire paper. (To their enormous credit, the authors of that paper have worked with us on subsequent studies, including two that are discussed in our book.) Our work relies on far larger samples and tests a much wider variety of issues than previous investigations in this area. On the other hand, our work is part of an emerging consensus that concerns about “post-truth” politics may be overblown: Research now shows that fake news is much less prevalent than commonly feared, and other scholars searching for the backfire effect have found results similar to ours.

Given all this, what explains the widespread belief in a “post-truth” world? We can think of several explanations. First, some of the purported anxiety about facts likely stands in for anxiety about political disagreement. It is tempting to believe that your opponents are too irrational to reason with. (Tempting, but probably wrong.) Second, those who spread misinformation, including no shortage of today’s politicians, are often memorable. Psychologists have shown that the vividness of a particular case leads us to overestimate how common such cases are. We can all easily call to mind a wild-eyed relative who traffics in conspiracy theories, but we tend to overlook the relatives whose views are more grounded. Finally, there is some evidence that, at least on Twitter, lies spread more quickly than truths.

Still, none of this means America’s information landscape functions perfectly. Even though fact-checks generally improve accuracy, there is little evidence that Americans are reading them in sufficiently large numbers. It is incumbent on the media to aggressively correct, without hesitation or fear of backfire, those politicians who spread misinformation. And it is further incumbent on the public not just to be aware that fact-checks exist but to read them.