The European Commission has accused TikTok of deploying design features intentionally engineered to be addictive, potentially harming the mental and physical wellbeing of its users, particularly minors and vulnerable adults. The preliminary findings, stemming from a two-year investigation under the EU’s Digital Services Act (DSA), represent a significant escalation in regulatory scrutiny of the popular video-sharing platform.
The Commission’s investigation identified several key features as contributing to the addictive design, including infinite scroll, autoplay, push notifications, and a highly personalized recommender system. Officials argue that these features are designed not simply to enhance user engagement but to maximize time spent on the app, often at the expense of user wellbeing. The AI-powered recommender system drew particular criticism for prioritizing engagement even where it risks negative experiences, especially for teenage users.
According to the Commission, TikTok’s design constantly “rewards” users with new content, creating a cycle that fuels compulsive scrolling and induces what officials describe as an “autopilot mode”. This, they contend, can erode self-control and foster potentially addictive behavior. The investigation also highlighted concerns about TikTok’s failure to adequately assess the risks associated with these features: specifically, how much time minors spend on the platform, particularly late at night, and how frequently users open the app.
The Commission’s findings suggest that TikTok’s existing screen time management and parental control tools are insufficient to mitigate the risks posed by its addictive design. These tools, the Commission argues, are easily dismissed and introduce so little friction that they fail to help users meaningfully control their usage. The investigation also examined the “rabbit hole effect”, the tendency for users to become engrossed in a continuous stream of content, and the risk of minors misrepresenting their age to access inappropriate material.
The proposed remedies outlined by the Commission include disabling key addictive features such as infinite scroll over time; implementing effective screen-time breaks, including during nighttime hours; and adapting the recommender system to prioritize user wellbeing. The Commission’s statement emphasized the need for a fundamental change to the basic design of TikTok’s service.
If the preliminary findings are upheld, TikTok could face a financial penalty of up to 6% of its global annual turnover. The Commission based its conclusions on an analysis of TikTok’s risk assessment reports, internal data, and responses to information requests, together with a review of recent scientific research and interviews with experts in behavioral addiction. It also considered reports from France, Denmark, and Poland detailing excessive TikTok usage among young people.
TikTok has vehemently denied the accusations, characterizing the Commission’s findings as a “categorically false and entirely meritless depiction” of the platform. The company maintains that it offers numerous tools to help users manage their time on the app, including daily screen time limits, sleep reminders, interactive meditation experiences, and the ability to silence notifications. TikTok also points to a 2025 UNICEF study which found “no clear evidence that screen time directly harms children’s mental health,” arguing that the focus should be on empowering users to make informed choices.
The company highlighted features designed specifically for younger users, such as a default 60-minute daily screen time limit for users aged 13-17 and “Family Pairing” controls that allow parents to restrict access and usage. TikTok also noted that users aged 13-15 do not receive push notifications after 9 pm, while those aged 16-17 have notifications disabled from 10 pm. The platform also presents guided meditation experiences to users who continue using the app late at night.
The EU’s action reflects growing global concern about the potential harms of social media, particularly for young people. The Digital Services Act, which became fully applicable in February 2024, aims to create a safer digital space by holding online platforms accountable for the content they host and the impact they have on users. The DSA requires platforms to assess and mitigate systemic risks, including those related to addictive design and the spread of harmful content.
The Taoiseach (Irish Prime Minister), Micheál Martin, described the Commission’s findings as “very serious” and suggested they vindicated the Irish government’s decision to restrict smartphone use in schools by introducing phone pouches. That move, initially criticized by the opposition, is now seen as a proactive step to protect young people’s mental health.
The Commission emphasized that the DSA is not intended as a censorship tool, but rather as a “due diligence” mechanism to address systemic risks while upholding freedom of expression. Officials stated that the investigation revealed “serious shortcomings” in TikTok’s risk assessment processes, noting that the company disregarded relevant evidence regarding excessive platform usage. The core principle, they explained, is that platforms must take into account the “best scientific evidence and expert knowledge” when assessing and mitigating risks.
The case highlights the tension between the business models of social media companies, which often rely on maximizing user engagement, and the need to protect vulnerable users from potential harm. The outcome of this investigation could set a precedent for how other platforms are regulated in the EU and beyond, potentially leading to significant changes in the design and operation of social media services.
