Google Hopes to Inoculate Internet Users Against Misinformation with Expanded 'Pre-bunking' Campaign

Google is growing its use of “pre-bunking” as part of its efforts to combat misinformation.

Google plans to expand a campaign against online misinformation to Germany this week, and later to India, as first reported by the Associated Press. The strategy, known as “pre-bunking” or “attitudinal inoculation,” aims to train people to recognize false information and manipulated facts online before they even encounter them.

In short videos and photos—shown across platforms like YouTube, TikTok, Twitter, and Facebook in standard advertising slots—Google will continue its push to make pre-bunking a go-to method for dispelling disinformation.


A massive study published in August 2022 demonstrated the potential value of pre-bunking among a group of nearly 30,000 participants. The researchers found that, after viewing pre-bunking videos highlighting well-known disinformation tactics like emotional appeals, false dichotomies, and ad hominem attacks, people were 5% better, on average, at identifying these tricks when shown a variety of social media posts.

That study was conducted by researchers from Cambridge and Bristol Universities, as well as partners from Alphabet-owned YouTube and Google’s internet threat research arm, Jigsaw. Google has also run smaller tests of its own on U.S. audiences, focused on covid-19 vaccine misinformation. A few months ago, the company began employing the pre-bunk strategy on a wider scale, testing the method beyond closed research studies and in the real world.

In fall 2022, the company launched tests in Poland, the Czech Republic, and Slovakia focused largely on combatting widely perpetuated, xenophobic, false claims about Ukrainian refugees (e.g., that refugees are criminals or steal jobs and housing). There, the company used videos that offered viewers ways to recognize unreliable sources of information and were intended to increase awareness of efforts to manipulate public perception.

In total, these videos were watched 38 million times, Google reported in a Jigsaw blog post published Monday, a figure equal to more than half of the combined population of the three countries. Further, the company’s internal researchers determined that those who watched the pre-bunk videos were more readily able to pick out misinformation techniques and less likely to pass along online lies to others.

Based on the Eastern Europe pilot, Google has determined its pre-bunk methods are successful enough to warrant growing the campaign across new borders. “Learnings from this campaign — including efforts to simplify critical messages and iteration on the survey questions to effectively measure knowledge gain — will inform our future experiments as we seek to better understand the effectiveness of prebunking in the wild,” wrote Beth Goldberg, Jigsaw’s head of research, in the blog.

In Germany, the AP reports that the tech giant’s campaign will incorporate both still images and videos highlighting how easy it is to share misinformation. One timely example of misinformation that could be stymied through the method, provided by the AP: a video of a 2020 explosion in Beirut has been shared widely across European social media under the false claim that it showed an incident triggered by this month’s devastating earthquake in Turkey. If a video were to circulate highlighting that this often happens after disasters, users might double-check the validity of earthquake content that they see.

Google has yet to specify any of the details of its forthcoming project in India, beyond that it will be launched “later this year.”

The company has struggled with misinformation in its search results and, of course, on YouTube. Compared to other methods of combatting disinformation’s spread, like increased content moderation and post-by-post fact-checks or addressing the underlying algorithms that direct people to increasingly extremist content, pre-bunking is likely a less resource-intensive strategy for tech companies. It also doesn’t require that these corporations directly weigh in on individual, highly politicized issues—just that they attempt to arm their user base with the tools to recognize when a claim might not be all that it seems.

But the strategy does have its downsides.

For one, accounting for cultural differences and ensuring cultural relevance appears critical to creating impactful pre-bunk content. Though Google’s Jigsaw found that its Eastern European campaign was effective overall, the biggest impact was seen in Poland. In the Czech Republic, outcomes were more mixed. In Slovakia, the campaign had no significant, observable effect. The company noted this could be because the Slovakian videos were dubbed rather than recorded specifically for that market, but Jigsaw noted that more research is needed.

Unsurprisingly, the videos were also found to be more effective among people who watched the whole thing—a challenge for even the most popular and prolific influencers on the web. Content volume is high and attention spans are short.

Beyond any single campaign, it’s also widely accepted that viewing one pre-bunk video doesn’t lead to lasting shifts in attitudes or awareness. People need repeated inoculations, or “booster” videos, to keep up their skepticism and media literacy.

And Google isn’t exactly a perfect arbiter of what’s true and what’s false. Last week, the company even spread some misinformation of its own in a promotion for its new AI tool.

Then, there’s the potential for the pre-bunking format itself to be co-opted into disinformation campaigns, or simply to serve as a stand-alone manipulation. Deciding what counts as disinformation, and what the hallmarks of bad information are, is a political choice that aims to sway public opinion, whether or not tech companies see it that way.

In the lead-up to the Russian invasion of Ukraine, the U.S. State Department tried to do a sort of pre-bunk of its own—warning people to look out for professionally produced Russian propaganda videos blaming terrorist attacks on Ukraine. However, those scary, rumored, high-budget propaganda videos never really materialized on social media platforms.

But regardless of the challenges, researchers see pre-bunking as a potential part of the fight against misinformation moving forward. As Jon Roozenbeek, one of the Cambridge University researchers on the August 2022 study, once told Gizmodo, “What I don’t think would be good... is if [Google and YouTube] just said, ‘Well don’t worry about our algorithms, we’ll just pre-bunk everything.’”
