Key takeaways from new reports on Russian disinformation

Russians seeking to influence U.S. elections through social media had their eyes on Instagram and the black community.

These were among the findings in two reports released Monday by the Senate intelligence committee. Separate studies from University of Oxford researchers and the cybersecurity firm New Knowledge reveal insights into how Russian agents sought to influence Americans by saturating their favorite online services and apps with hidden propaganda.

Here are the highlights:

INSTAGRAM'S "MEME WARFARE"

Both reports show that misinformation on Facebook's Instagram may have had broader reach than the interference on Facebook itself.

The New Knowledge study says that since 2015, Russian-linked Instagram posts have generated 187 million engagements, such as comments or likes, compared with 77 million on Facebook.

And the barrage of image-centric Instagram "memes" has only grown since the 2016 election. Russian agents shifted their focus to Instagram after the public became aware last year of the widespread manipulation on Facebook and Twitter.

NOT JUST ADS

Revelations last year that Russian agents used rubles to pay for some of their propaganda ads drew attention to how gullible tech companies were in allowing their services to be manipulated.

But neither ads nor automated "bots" were as effective as unpaid posts hand-crafted by human agents pretending to be Americans. Such posts were more likely to be shared and commented on, and they rose in volume during key dates in U.S. politics such as during the presidential debates in 2016 or after the Obama administration's post-election announcement that it would investigate Russian hacking.

"These personalized messages exposed U.S. users to a wide range of disinformation and junk news linked to on external websites, including content designed to elicit outrage and cynicism," says the report by Oxford researchers, who worked with social media analysis firm Graphika.

DEMOGRAPHIC TARGETING

Both reports found that Russian agents tried to polarize Americans in part by extensively targeting African-American communities. They did so with campaigns urging black voters to boycott the 2016 election or to follow incorrect voting procedures, according to the Oxford report.

The New Knowledge report added that agents were "developing Black audiences and recruiting Black Americans as assets," an effort that went beyond their targeting of left- or right-leaning voters.

The reports also support previous findings that the influence operations sought to polarize Americans by sowing political divisions on issues such as immigration and cultural and religious identities. The goal, according to the New Knowledge report, was to "create and reinforce tribalism within each targeted community."

Such efforts extended to Google-owned YouTube, despite Google's earlier assertion to Congress that Russian-made videos didn't target specific segments of the population.

PINTEREST TO POKEMON

The New Knowledge report says the Russian troll operation worked in many ways like a conventional corporate branding campaign, using a variety of different technology services to deliver the same messages to different groups of people.

Among the sites infiltrated with propaganda were popular image-heavy services like Pinterest and Tumblr, chatty forums like Reddit, and a wonky geopolitics blog promoted from Russian-run accounts on Facebook and YouTube.

Even the silly smartphone game "Pokemon Go" wasn't immune. A Tumblr post encouraged players to name their Pokemon characters after victims of police brutality.

WHAT NOW?

Both reports warn that some of these influence campaigns are ongoing.

The Oxford researchers note that 2016 and 2017 saw "significant efforts" to disrupt elections around the world not just by Russia, but by domestic political parties spreading disinformation.

They warn that online propaganda represents a threat to democracies and public life. They urge social media companies to share data with the public far more broadly than they have so far.

"Protecting our democracies now means setting the rules of fair play before voting day, not after," the Oxford report says.