The European Union is formally investigating TikTok’s compliance with the bloc’s Digital Services Act (DSA), the Commission has announced.
In its investigation of TikTok, the Commission is focusing on areas linked to the protection of minors, advertising transparency, data access for researchers, and the risk management of addictive design and harmful content, it said in a press release.
The DSA is the bloc’s online governance and content moderation rulebook, which since Saturday has applied broadly to what is likely thousands of platforms and services. Since last summer, however, larger platforms such as TikTok have faced a set of extra requirements in areas like algorithmic transparency and systemic risk, and it is those rules the video-sharing platform is now being investigated under.
Today’s move follows several months of information gathering by the Commission, which enforces the DSA rules for larger platforms — including in areas like child protection and disinformation risks.
The EU’s concerns over TikTok’s approach to content governance and safety predate the DSA coming into force on larger platforms.
TikTok was contacted for comment on the EU’s formal investigation. It’s the second such proceeding, after the probe the bloc opened on Elon Musk-owned X (formerly Twitter) in December, also citing a string of concerns.
Penalties for confirmed breaches of the DSA can reach up to 6% of global annual turnover.
In its press release, the Commission says the probe of TikTok’s compliance with DSA obligations in the area of systemic risks will look at “actual or foreseeable negative effects” stemming from the design of its system, including algorithms. The EU is worried TikTok’s UX may “stimulate behavioural addictions and/or create so-called ‘rabbit hole effects’”, as its PR puts it.
“Such assessment is required to counter potential risks for the exercise of the fundamental right to the person’s physical and mental well-being, the respect of the rights of the child as well as its impact on radicalisation processes,” it further writes.
The Commission is also concerned that mitigation measures TikTok has put in place to protect kids from accessing inappropriate content — namely age verification tools — “may not be reasonable, proportionate and effective”.
It will therefore also look at whether TikTok is complying with “DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems”.
Elsewhere, the bloc’s probe will look at whether TikTok is fulfilling the DSA requirement to provide “a searchable and reliable repository” for ads that run on its platform.
Also on transparency, the Commission says its investigation concerns “suspected shortcomings” by TikTok when it comes to providing researchers with access to publicly accessible data on its platform so they can study systemic risk in the EU — with such data access being mandated by Article 40 of the DSA.