EU Commission Signals Potential DSA Breach Over TikTok’s Addictive Design

The European Commission has issued preliminary findings indicating that TikTok may be in breach of the Digital Services Act (DSA) due to the platform’s addictive design features. The investigation examines design features the Commission considers potentially addictive, including infinite scrolling, autoplay, push notifications, and the platform’s highly personalised recommender system, and whether TikTok adequately assessed and mitigated the risks these features pose to users.

While the findings are not final, they signal a growing regulatory emphasis on platform design choices and their impact on user wellbeing, particularly for minors and vulnerable individuals.

Background: the DSA 

The DSA establishes a comprehensive regulatory framework aimed at ensuring a safer and more transparent online environment across the European Union. Among its core objectives are reducing systemic risks arising from the design and functioning of online platforms, protecting users’ fundamental rights, and imposing heightened obligations on “very large online platforms”, a category that includes TikTok.

Under the DSA, very large online platforms are required to identify, analyse and mitigate systemic risks stemming from their services.

Risk assessment concerns

The Commission points to behavioural science research suggesting that certain interface mechanics, such as continuous content feeds, may encourage compulsive engagement. By constantly rewarding users with new content, these features may foster habitual scrolling, shift users into an “autopilot mode”, and diminish their ability to disengage and self-regulate. Over time, this may contribute to compulsive behaviour, reduced self-control, sleep disruption and other negative impacts.

According to the Commission’s preliminary assessment, TikTok did not sufficiently evaluate how these design features may negatively affect users’ physical and mental wellbeing. Particular concern was raised over indicators of compulsive use associated with TikTok’s design, which were not adequately considered in the company’s risk analysis.

Risk mitigation measures under scrutiny

As noted above, the DSA requires platforms to implement reasonable, proportionate, and effective mitigation measures, and the Commission’s investigation therefore also examined whether TikTok put adequate safeguards in place to address identified harms.

The preliminary view is that TikTok has fallen short in this respect: existing measures, such as screen-time management tools, may not be ‘reasonable, proportionate, and effective’, as they can be easily bypassed or dismissed. Similarly, parental control features may require a level of effort or digital literacy that reduces their practical effectiveness.

At this stage, regulators have indicated that more fundamental design adjustments may be necessary. Potential changes referenced include disabling or modifying infinite scroll mechanics, implementing more robust and enforced screen-time breaks (including overnight protections), and revisiting recommendation system dynamics.

Regulatory significance

These preliminary conclusions form part of an in-depth investigation involving review of TikTok’s internal documentation, risk assessment reports, responses to regulatory inquiries and additional consultation. The Commission emphasises that its findings do not prejudge the final outcome of the proceedings.

Regardless of the outcome, the investigation illustrates how regulators are increasingly scrutinising platform architecture, not just content moderation, under the DSA. If confirmed, the findings could set an important precedent for how digital service providers design engagement features, especially where vulnerable users are concerned.

Implications for other digital platforms

The investigation underscores compliance lessons for platforms operating in the EU, including: 

  • Risk assessments must meaningfully address behavioural and psychological impacts of platform design

  • Safeguards must be demonstrably effective, not merely available

  • Protection of minors remains a central regulatory priority; platforms accessible to minors are expected to meet a higher standard of compliance

  • Interface and recommender system design may attract the same scrutiny as content policies or terms and conditions

Next steps

TikTok will now have the opportunity to formally respond to the Commission’s preliminary findings before any final decision is reached. This procedural phase is a critical component of EU regulatory enforcement, ensuring that the company can present legal arguments, factual clarifications, and proposed measures.

If the Commission ultimately confirms its preliminary position, it has a range of enforcement tools available under the DSA. These may include binding orders requiring TikTok to modify elements of its platform design to address identified risks, particularly those linked to addictive or compulsive user behaviour. Structural changes could involve redesigning engagement features, strengthening safeguards for minors, or implementing more robust risk mitigation systems. Financial penalties are also a possibility: non-compliant platforms may face fines of up to 6% of their total worldwide annual turnover.

Businesses operating in the digital ecosystem should closely monitor developments, as the regulatory interpretation emerging from this case may influence future compliance standards across the sector. If you have any questions, please get in touch to arrange a free initial consultation.

Meet your legal counsel

Isadora Werneck

Partner
Isadora is a Partner at Logan & Partners, focusing on the complex landscape of information technology and consumer law.
