Study: Deepfakes weaponized in Russia's war against Ukraine

Deepfake videos on social media have become weapons of war, undermining trust and fueling conspiracy theories during Russia's ongoing invasion of Ukraine, according to new research from scholars at University College Cork in Ireland.

"For the first time we’ve seen deepfake propaganda and misinformation that has attempted to influence a war," said the report, led by John Twomey and published in the journal PLOS ONE on Oct. 25.

Deepfakes are videos generated by artificial intelligence and designed to convince viewers that events occurred when they did not. Although fake, the videos appear convincing and are often produced to imitate a real person.

Twomey and his colleagues examined over 4,800 posts on X (formerly Twitter) that discussed deepfakes over the first seven months of 2022, using a method known as qualitative analysis.

They found that deepfakes erode viewers' confidence in the authenticity of war-related footage, leading them to distrust even genuine content.

"Unfortunately, the majority of this type of deepfake discourse during the war consisted of unhealthy skepticism fueled by deepfakes," the study said.

One prominent deepfake deployed in the war was a video that falsely showed President Volodymyr Zelensky surrendering to Russia. Another depicted Russian dictator Vladimir Putin declaring peace. Researchers also mentioned the "Ghost of Kyiv" footage, which claimed to show a Ukrainian fighter pilot but was in fact taken from a video game.

The study cautioned that deepfakes such as these undermine trust in legitimate videos, with viewers more frequently mislabeling authentic media as deepfakes.

While deepfakes represent a new form of media, misinformation has long been a weapon in Russia's war against Ukraine.

Read also: David Kirichenko: The continued menace of Russian disinformation
