The European Commission this week preliminarily found TikTok in breach of the Digital Services Act for its addictive design features. The regulatory action marks the first time Brussels has directly targeted platform architecture elements including infinite scroll, autoplay, push notifications, and highly personalized recommender systems for their effects on user wellbeing.

According to the Commission's press release issued February 6, 2026, TikTok failed to adequately assess how these addictive features could harm the physical and mental wellbeing of users, including minors and vulnerable adults. The platform now faces potential fines up to 6% of ByteDance's global annual turnover if the preliminary findings are confirmed following investigation proceedings.

Risk assessment failures

The Commission's investigation preliminarily indicates that TikTok disregarded important indicators of compulsive app use during its risk assessment process. TikTok's evaluation ignored the time minors spend on the platform at night, the frequency with which users open the app, and other potential indicators of addictive behavior patterns.

By constantly rewarding users with new content, certain design features of TikTok fuel the urge to keep scrolling and shift users' brains into "autopilot mode," according to the Commission. Scientific research shows this may lead to compulsive behavior and reduce users' self-control. The investigation included analysis of TikTok's risk assessment reports, internal data and documents, TikTok's responses to multiple information requests, a review of extensive scientific research on the topic, and interviews with experts in multiple fields including behavioral addiction.

Executive Vice-President for Tech Sovereignty, Security and Democracy Henna Virkkunen stated that "social media addiction can have detrimental effects on the developing minds of children and teens. The Digital Services Act makes platforms responsible for the effects they can have on their users. In Europe, we enforce our legislation to protect our children and our citizens online."

Ineffective mitigation measures

The Commission preliminarily found that TikTok failed to implement reasonable, proportionate and effective measures to mitigate risks stemming from its addictive design. The current measures on TikTok, particularly the screen time management tools and parental control tools, do not appear to effectively reduce the risks stemming from the platform's addictive design.

The time management tools do not appear effective in helping users reduce and control their use of TikTok, because they are easy to dismiss and introduce little friction. Similarly, parental controls may be ineffective because they demand additional time and skill from parents to set up, according to the Commission's preliminary findings.

At this stage, the Commission considers that TikTok needs to change the basic design of its service. Suggested modifications include disabling key addictive features such as infinite scroll over time, implementing effective screen time breaks, including during the night, and adapting its recommender system.

Virkkunen told reporters that "we are expecting after that TikTok has to take actions and they have to change the design of their service in Europe to protect our minors."

TikTok's response

A TikTok spokesperson criticized the European Commission charges, stating that "the Commission's preliminary findings present a categorically false and entirely meritless depiction of our platform, and we will take whatever steps are necessary to challenge these findings." The regulatory move underscores the European Union's crackdown on Big Tech, which has drawn criticism from the U.S. government over censorship and led to threats of tariffs.

TikTok's algorithm, driven by understanding of user interests, has been key to the social media app's global success. The platform's recommender system analyzes user behavior to deliver highly personalized content feeds designed to maximize engagement and time spent on the platform.

Broader DSA enforcement context

The Commission's preliminary findings today are part of its formal proceedings to investigate TikTok's compliance with the Digital Services Act, launched on February 19, 2024. As well as addictive design, this investigation covers the "rabbit hole effect" of TikTok's recommender systems, the risk of minors having an age-inappropriate experience due to misrepresentation of their age, and the platform's obligations to ensure a high level of privacy, safety and security for minors.

The proceedings also covered researchers' access to public data, for which preliminary findings were adopted in October 2025, and advertising transparency, which was resolved through binding commitments in December 2025. Meta Platforms' Facebook and Instagram were charged with DSA breaches in October last year over, among other issues, their so-called dark patterns, or deceptive interface designs.

EU regulators also asked Snapchat, YouTube, Apple and Google for information on their age verification systems and how they prevent minors from accessing illegal products and harmful material. The step comes as some European countries, including France and Spain, as well as India consider banning social media access for teenagers in a sign of hardening attitudes towards the technology that some say is designed to be addictive.

In December, Australia became the world's first country to block children under 16 from platforms including TikTok, Alphabet's YouTube and Meta's Instagram and Facebook. Virkkunen reiterated that it is up to individual countries to set age limits rather than for Brussels to impose them centrally. "But of course when we look at digital markets, it would be good to have a common approach here," she said, noting differences between member states in how the services are used.

Marketing industry implications

The enforcement action carries significant implications for digital marketers and advertisers who rely on TikTok to reach audiences. When platforms face potential fines and operational restrictions, advertising ecosystems can be disrupted. Any changes TikTok makes to its feed design or to comply with the DSA could affect ad delivery, targeting capabilities, and overall campaign performance.

Despite being a relative newcomer to the digital advertising landscape, TikTok has rapidly become essential for brands targeting younger demographics. The platform's highly personalized recommender system, the same technology now under regulatory scrutiny, forms the foundation of its advertising effectiveness. Modifications to that system to comply with DSA requirements could fundamentally alter how advertisements are delivered and how audiences are targeted on the platform.

The Digital Services Act has been reshaping how platforms operate in the European Union since it became applicable, requiring unprecedented levels of transparency and accountability. For marketing professionals using social media platforms, enforcement actions of this kind underscore the importance of understanding how platforms handle data and where regulatory risk lies.

Companies must consider how their advertising and content strategies align with evolving privacy regulations and platform compliance measures. The outcome of this investigation may also influence how other technology companies approach regulatory transparency and data governance.

EU lawmaker Alexandra Geese praised the EU move on TikTok, stating that "many social media platforms ruthlessly exploit these mechanisms to boost advertising revenue at the expense of the health of children and teenagers. This must come to an end."

Technical requirements under DSA

The Commission's investigation examined whether TikTok complied with multiple Digital Services Act requirements. Article 34 requires very large online platforms to conduct risk assessments that identify, analyze, and assess any systemic risks stemming from the design or functioning of their service and its related systems. Article 35 mandates platforms to put in place reasonable, proportionate, and effective mitigation measures tailored to the specific systemic risks identified.

Article 42 requires platforms to notify the Commission and the Digital Services Coordinator of establishment before deploying functionalities that may have a critical impact on their risk profile.

The Commission has pursued multiple enforcement actions under the DSA framework. In the second half of 2024, users within the EU challenged 16 million content removal decisions taken by TikTok and Meta. The success rate for these challenges reached 35 percent, meaning more than one-third of the moderation decisions initially taken by the major platforms were deemed unjustified and subsequently reversed.

A core component of the DSA involves mandatory risk assessments for very large online platforms. The legislation requires these entities to evaluate how their algorithmic systems might amplify systemic risks such as electoral manipulation or negative impacts on mental health. Platforms must then implement mitigation measures proportionate to identified risks, with independent audits verifying compliance.

Next steps and procedural rights

TikTok can now exercise its rights of defence: it may examine the documents in the Commission's investigation file and reply in writing to the preliminary findings. In parallel, the European Board for Digital Services will be consulted on the findings.

If its preliminary views are ultimately confirmed, the Commission may issue a non-compliance decision, which can trigger a fine proportionate to the nature, gravity, recurrence and duration of the infringement, up to 6% of the provider's total worldwide annual turnover.

These preliminary findings do not prejudge the outcome of the investigation. Last month, TikTok settled a social media addiction lawsuit ahead of a trial that is set to proceed against Meta and YouTube.

TikTok last year settled charges of infringing a DSA requirement to publish an advertisement repository allowing researchers and users to detect scam advertisements. The platform has faced escalating regulatory pressure across multiple jurisdictions regarding data transfers, privacy protections, and content moderation practices.

International regulatory landscape

TikTok has previously faced regulatory challenges in European markets, including fines from Dutch authorities for providing privacy policies only in English to Dutch users, including children. The Federal Trade Commission filed a lawsuit against TikTok and ByteDance for alleged violations of the Children's Online Privacy Protection Act, highlighting global regulatory concerns about the platform's data handling practices.

The Irish Data Protection Commission issued a landmark decision in April 2025, determining that when staff in a third country remotely access personal data of European Economic Area users, that access itself constitutes a transfer under the General Data Protection Regulation. The ruling resulted in a €530 million administrative fine against TikTok Technology Limited.

Privacy advocacy group noyb filed formal complaints in January 2025 against six Chinese firms, including TikTok, alleging unauthorized data transfers to China. The complaints, submitted to authorities in Austria, Belgium, Greece, Italy, and the Netherlands, specifically target the data transfer practices of companies including TikTok, AliExpress, SHEIN, Temu, WeChat, and Xiaomi.

Virkkunen said investigations into other online platforms were advancing well, with decisions expected in the coming weeks and months; she did not name any company. Former EU commissioners defended the Digital Markets Act and DSA against accusations of censorship in January 2026, following the Trump administration's decision to bar five European officials from entering the US.

The commentary, authored by Bertrand Badré, Guillaume Klossa, and Margrethe Vestager, argued that these regulations are designed to curb the market dominance of gatekeeper platforms and ensure algorithmic accountability rather than police speech. The enforcement environment continues evolving as authorities adapt to emerging technologies and business models.

Timeline

  • February 19, 2024: European Commission opens formal proceedings against TikTok under the Digital Services Act
  • January 16, 2025: Privacy advocacy group noyb files formal complaints against six Chinese technology companies including TikTok
  • April 30, 2025: Irish Data Protection Commission issues €530 million fine against TikTok for data transfer violations
  • May 2, 2025: Irish Data Protection Commission announces final decision following inquiry into TikTok's data transfers to China
  • July 12, 2025: Irish DPC commences new inquiry into TikTok over China data storage violations
  • July 16, 2025: Irish High Court allows TikTok to challenge €530 million data protection fine
  • October 24, 2025: Commission preliminarily finds TikTok and Meta in breach of DSA transparency rules
  • December 2025: European Commission closes advertising transparency investigation into TikTok through binding commitments
  • January 26, 2026: European Commission launches formal investigation into X's Grok deployment and extends proceedings on recommender systems
  • February 6, 2026: European Commission preliminarily finds TikTok in breach of Digital Services Act for addictive design features

Summary

Who: The European Commission preliminarily found TikTok in breach of the Digital Services Act, with Executive Vice-President Henna Virkkunen leading the enforcement action against ByteDance's platform for failing to adequately assess and mitigate risks from addictive design features.

What: TikTok faces preliminary DSA breach findings over addictive design features including infinite scroll, autoplay, push notifications, and a highly personalized recommender system that fuel compulsive behavior and shift users into autopilot mode. The platform failed to implement effective mitigation measures, with its screen time management tools and parental controls deemed insufficient to reduce the risks. At this stage, the Commission considers that TikTok needs to change its basic service design, potentially by disabling infinite scroll over time, implementing effective screen time breaks, and adapting its recommender system.

When: The European Commission announced the preliminary findings on February 6, 2026, as part of formal proceedings launched on February 19, 2024. The investigation included analysis of TikTok's risk assessments, internal data, scientific research reviews, and expert interviews in behavioral addiction fields. The preliminary findings follow October 2025 transparency violation findings and December 2025 binding commitments on advertising transparency.

Where: The enforcement action affects TikTok's operations throughout the European Union, where the Digital Services Act has been reshaping platform accountability since February 2024. TikTok must implement design changes to its service in Europe to protect minors and vulnerable users, with the Commission's jurisdiction extending to all 27 EU member states plus Iceland, Norway, and Liechtenstein through the European Economic Area.

Why: The Commission found that TikTok's features constantly reward users with new content, fueling the urge to keep scrolling and shifting brains into autopilot mode, which scientific research shows may lead to compulsive behavior and reduced self-control. TikTok disregarded important indicators of compulsive use including time minors spend on the platform at night and the frequency with which users open the app. The enforcement action aims to protect children and teenagers from social media addiction's detrimental effects on developing minds, with the DSA making platforms responsible for effects on users' physical and mental wellbeing.
