EU Commission opens probe into TikTok over child protection concerns

The logo of the TikTok platform is displayed on a smartphone. The European Commission on 19 February opened an investigation into TikTok over suspected breaches of European Union rules on child protection and advertising transparency. Monika Skolimowska/dpa

The European Commission on Monday opened an investigation into TikTok over suspected breaches of European Union rules on child protection and advertising transparency.

The commission is investigating whether the video-sharing platform's algorithms may be addictive or create "rabbit hole effects." The EU executive is also looking into the privacy settings TikTok provides for minors, as well as whether it is meeting obligations to provide a searchable repository of the ads shown on its platform.

The probe concerns potential breaches of the Digital Services Act (DSA), a relatively recent law governing online platforms. The DSA requires large platforms to manage various risks, including to mental well-being and the rights of children. It also forbids using minors' personal data to show them targeted ads.

If TikTok is ultimately found to have breached the DSA's risk mitigation rules, it could face fines as high as 6% of its global annual revenue.

The commission said it was investigating whether TikTok was adequately managing the risk of "actual or foreseeable negative effects" of its algorithms "that may stimulate behavioural addictions and/or create so-called 'rabbit hole effects.'"

The EU executive will also look into whether "age verification tools used by TikTok to prevent access by minors to inappropriate content, may not be reasonable, proportionate and effective."

Also part of the investigation is whether TikTok's default privacy settings for minors comply with the DSA.

Child protection is just one aspect of the DSA's broad-ranging obligations. The regulation also requires large platforms to publish searchable repositories of the advertisements they show to users, and the commission is investigating whether TikTok is complying with this obligation as well.

Finally, the investigation will look into "suspected shortcomings in giving researchers access to TikTok's publicly accessible data," another DSA obligation.

While some of the DSA's rules apply to all platforms, the most onerous requirements — particularly those concerning risk mitigation — only apply to platforms designated by the commission as Very Large Online Platforms (VLOPs).

Platforms are classed as VLOPs if they have more than 45 million monthly active users in the EU, or "active recipients of the service" as they're called in the law. The DSA requires platforms to submit their user numbers to the commission.

The commission listed TikTok as a VLOP in April on the grounds that it had 135.9 million monthly active users in the EU.

TikTok is currently appealing to the EU General Court against a €345 million ($372 million) fine issued by the Irish Data Protection Commission over its handling of minors' personal data.