Don't forget about TikTok amid the national conversation on social media dangers.

Two years ago, as part of a research study analyzing the tools that white supremacist groups use to push their propaganda, I turned to TikTok, the Chinese-owned platform wildly popular with American teenagers for dance crazes such as Renegade and “Berries and Cream.” By that point, experts already knew that TikTok had also drawn the attention of extremist organizations such as Patriot Front.

But what my colleagues and I found during our research was truly astounding. I discovered troves of videos of children doing Nazi salutes and using racial slurs, young adults pretending to be members of the Ku Klux Klan, and militia members showing off their stockpiled weapons for the "race war."

Steps to address social media threat

My discovery last year while interning with the Anti-Defamation League, together with my current research, adds to congressional, public and media scrutiny of Facebook and other social media platforms and their powerful, unprecedented impact on the youngest members of our society, on our democracy and on our everyday life. And it demands answers to the questions that hang over our use of social media:

►What can be done to mitigate the many negative effects that these platforms provoke?

►How can we tighten the regulations, policies and laws that govern the virtual world?

►Do we need more resources to monitor and moderate social media platforms?

►And can we, as citizens, play a useful and effective role in stopping the spread of extremism, violence and intolerance in social media?

To me, a scholar of extremism and a program research associate at American University’s Polarization and Extremism Research and Innovation Lab (PERIL), watching thousands of hours of these videos led to an obvious and urgent answer: If you see something on TikTok, or any other social media platform, say something!

The social media app TikTok on an iPhone.

TikTok has been attracting young Americans since its merger with Musical.ly in 2018. It has become a powerful tool for extremist groups to share information, inspire recruits and marshal their followers. Yet TikTok does not effectively moderate its content or prevent radical groups and drug cartels from manipulating the platform.


Recently, the Department of Homeland Security confirmed that extremists used TikTok in the run-up to the Capitol insurrection on Jan. 6, posting videos on how to gain entrance into the Capitol along with other tactical guidance. Virtual “tour guides” mimicked popular accounts that showcase restaurants and attractions while telling users how to enter the Capitol without being detected.

But the Capitol riot was only one example of TikTok’s power to inspire and funnel extremist action. In 2020, it was reported that Mexican cartels were using TikTok to post videos on how to avoid detection when transporting drugs, as well as recruitment videos touting the luxurious lifestyle of drug dealers. This comes at a time of a significant increase in drug overdoses across the United States, with TikTok becoming the “‘perfect storm’ in which social media normalizes and influences the way (teenagers) view drugs or other topics,” as The Wall Street Journal puts it.

Kesa White is a program research associate at American University’s Polarization and Extremism Research and Innovation Lab.

What is truly concerning is how, for a person who is “on the fence” about attending a demonstration or simply curious about using illegal drugs, just a few seconds of propaganda on TikTok can make the difference between engagement and avoidance. The site’s algorithms are tuned to lure a user into behavior that he or she might previously have avoided.

If you see something, say something

On the positive side, TikTok has somewhat improved its moderation over the past two years, and its community guidelines appear admirable: They discuss and condemn violent extremism, sexual activity and other criminal activity. Yet trying to police millions of videos has proved virtually impossible, and the insurrection videos that I viewed showed that TikTok’s own guidelines have plenty of loopholes. For example, tactical guidance is not classified as "illegal activities and regulated goods" or "violent extremism," and it is relatively easy for someone to create a new profile after an offending one has been banned.


Unquestionably, the social media giants need to work with law enforcement agencies and lawmakers and apply proven strategies to curb the wave of online domestic radicalization. Everyone who uses these platforms for entertainment or legitimate business should report videos that violate TikTok’s own guidelines to the company’s moderators. Parents and caregivers must keep a close eye on the videos their children are watching and posting. And parents need to be aware that TikTok’s addictive content changes every day, even every hour.

Today, tomorrow and every day, treat it just as you would a suspicious person at an airport or an abandoned parcel on the metro: If you see something, say something!

Kesa White is a program research associate at American University’s Polarization and Extremism Research and Innovation Lab. Follow her on Twitter: @whitekesa

