Is TikTok Just as Bad as Facebook for Spreading Misinformation?


Facebook may be getting all the blame lately when it comes to social media companies spreading misinformation, but TikTok is far from immune. New investigations and research show that TikTok’s surge in popularity has come with similar growing pains over safety issues and content moderation.

On Tuesday, executives from TikTok, Snap and YouTube will testify before a U.S. Senate committee on consumer protection and product safety to defend how the platforms are protecting children from harmful content online. The hearing comes as Congress continues to explore cracking down on Big Tech companies over online protections and antitrust regulation.

“The company behind TikTok is an especially egregious offender, both because they make the personal information of all TikTok users available to the Chinese Communist Party, and because their app pushes sexually explicit and drug-related content onto children,” Sen. Marsha Blackburn (R-Tennessee) said in her email newsletter.

TikTok, which is owned by the Beijing-based ByteDance, was forced earlier this month to answer to regulators about its role in helping extremists recruit and organize rioters for the Jan. 6 assault on the U.S. Capitol. Additionally, researchers have revealed that the app is responsible for spreading anti-vaccination messages, white supremacy content and other dangerous and violent posts involving Mexican drug cartels.

“All social media platforms will eventually have to tackle the challenge of content moderation — we saw this with new platforms such as Clubhouse as well,” Stephanie Chan, analyst at Sensor Tower, told TheWrap.

Representatives for TikTok declined repeated requests for an interview.

Much like its social media rivals, TikTok’s algorithm makes the app addictive as it learns your interests to drive more and more engagement — TikTok just does it a lot faster than other apps. When The Wall Street Journal investigated TikTok’s algorithm earlier this year by creating more than a hundred bots, the app was able to learn those dummy accounts’ vulnerabilities and serve up tailored content in under one hour. Merely tracking how long users lingered or rewatched a video was enough for the algorithm to do its job.

This month, TikTok’s Q2 enforcement report showed the platform removed a total of 81 million videos globally that were in violation of its community guidelines or terms. TikTok noted that this accounts for less than 1% of all videos uploaded to the app. Of these, some 33 million banned videos (41.3%) were flagged for “minor safety” — harm, endangerment, abuse or exploitation of minors. Another 20.9% were removed for illegal activities and regulated goods; 14% of videos were removed for adult nudity and sexual activities.

In Q2, TikTok removed 27,518 videos for COVID disinformation. Of those, 83% were removed before they were reported to TikTok, in part through the platform’s automatic detection systems. The company said it also uses human content moderators and relies on third-party fact-checking services.

Nonetheless, some questionable content falls through the cracks. An August report from media firm NewsGuard found that COVID-related misinformation reached eight out of nine child participants within their first 35 minutes on the app. Two-thirds of participants, aged 9 to 17, saw incorrect information about COVID vaccines. Children under 13 are not allowed on TikTok, but many find workarounds by lying about their age.

Kesa White, a researcher at American University’s Polarization and Extremism Research Innovation Lab, has been studying the harmful effects of TikTok for the past two years, documenting disturbing content ranging from children pretending to be Ku Klux Klan members to others performing the Nazi salute. She said many of those posting and engaging with this type of content were in their early teens.

“It was very frightening hearing these children talking about this,” White told TheWrap. “(Children) are now becoming political commentators and talking about why they are not getting vaccinated. Children don’t know right from wrong at this stage, and there is so much information we still don’t know about COVID.”

It’s a reminder that harmful content can spread just as quickly as the dance challenges and lip sync videos that go viral on TikTok. This may happen more frequently on TikTok because people sometimes take these short clips less seriously, White added. “TikTok is so widely popular with younger demographics for dance challenges, and people come for different reasons. The videos are so short. You don’t need professional equipment. It’s a fairly easy mechanism.”

This year, TikTok is on track to beat its social media competitors in attracting more users and gaining more engagement, especially among Gen Z users born between 1997 and 2012. Acquired by Chinese tech company ByteDance in 2017, the app has already surpassed Facebook in usage time despite launching just five years ago, according to eMarketer. In September, TikTok hit 1 billion monthly users; 17-year-old Facebook reported 3.5 billion monthly users in Q2.
