European Union regulators are taking aim at TikTok’s core functionality, accusing the video-sharing platform of employing “addictive design” features that exploit user behavior, particularly among young people. The preliminary charges allege that TikTok has not adequately assessed the risks that features like autoplay and infinite scroll pose to the physical and mental wellbeing of its users.
The investigation, spanning two years, is being led by the European Commission, the executive arm of the 27-nation EU, and operates under the framework of the Digital Services Act (DSA). The DSA, a sweeping set of regulations, compels social media companies to proactively address harmful content and protect users, with the threat of substantial fines for non-compliance. This action against TikTok represents a significant test of the DSA’s enforcement power.
The Core of the Complaint: Addictive by Design
At the heart of the EU’s concerns are features designed to maximize user engagement. Autoplay, which automatically begins playing the next video in a feed, and infinite scroll, which continuously loads new content as a user scrolls down, are identified as key contributors to compulsive use. Regulators argue that these features bypass a user’s conscious decision to continue consuming content, effectively putting their brains “into autopilot mode,” as described by the European Commission. This constant stream of stimuli, they contend, can lead to reduced self-control and potentially harmful behavioral patterns.
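Neither pattern is unique to TikTok; both are standard front-end engagement techniques. The sketch below is purely illustrative of how such a feed is commonly wired up in browser code, not TikTok’s actual implementation; the element IDs and the fetchNextVideos function are hypothetical.

```typescript
// Illustrative sketch of autoplay + infinite scroll (hypothetical names throughout).
const feed = document.querySelector<HTMLElement>("#feed")!;
const sentinel = document.querySelector<HTMLElement>("#sentinel")!;

// Autoplay: when one video ends, immediately start the next —
// no conscious decision is required to keep watching.
feed.addEventListener("ended", (event) => {
  const video = event.target as HTMLVideoElement;
  const next = video.nextElementSibling as HTMLVideoElement | null;
  next?.play();
}, true); // capture phase, since the "ended" event does not bubble

// Infinite scroll: when the user nears the bottom of the feed,
// silently append more content so the feed never runs out.
new IntersectionObserver(async (entries) => {
  if (entries[0].isIntersecting) {
    const more = await fetchNextVideos(); // hypothetical API call
    more.forEach((v) => feed.appendChild(v));
  }
}).observe(sentinel);

// Hypothetical loader; a real feed would hit a recommendation endpoint.
async function fetchNextVideos(): Promise<HTMLVideoElement[]> {
  return []; // placeholder
}
```

In both cases the default is continuation: the user must actively intervene to stop, rather than actively choose to continue.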
The EU’s investigation extends beyond these two features, also scrutinizing the impact of push notifications and TikTok’s highly personalized content recommendation system. The algorithm, which learns user preferences and serves up tailored videos, is accused of reinforcing existing interests and potentially creating echo chambers, further contributing to addictive behavior. The Commission alleges TikTok failed to adequately assess how these features could harm the physical and mental wellbeing of its users, including minors and “vulnerable adults.”
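The feedback loop regulators describe can be captured in a toy model. The following is a deliberately simplified sketch, not TikTok’s algorithm: the topic names and weighting scheme are invented, but the dynamic, where watch time reinforces the very content that generated it, is the mechanism at issue.

```typescript
// Toy model of an engagement-driven feedback loop (invented topics and weights).
type Topic = "dance" | "news" | "gaming";

const weights: Record<Topic, number> = { dance: 1, news: 1, gaming: 1 };

// Sample the next topic proportionally to its learned weight.
function recommend(): Topic {
  const total = Object.values(weights).reduce((a, b) => a + b, 0);
  let r = Math.random() * total;
  for (const [topic, w] of Object.entries(weights) as [Topic, number][]) {
    if ((r -= w) <= 0) return topic;
  }
  return "dance";
}

// Feedback: every second watched reinforces the topic that was shown.
function recordWatch(topic: Topic, seconds: number): void {
  weights[topic] += seconds * 0.1;
}

// Simulate a user who lingers on gaming videos: after enough iterations,
// that one topic dominates the feed — the "echo chamber" effect.
for (let i = 0; i < 1000; i++) {
  const t = recommend();
  recordWatch(t, t === "gaming" ? 10 : 1);
}
console.log(weights); // gaming's weight vastly exceeds the others
```

The point of the model is that no one programs an echo chamber explicitly; it emerges from optimizing for engagement alone.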
Potential Remedies: A Fundamental Redesign?
The European Commission believes TikTok needs to fundamentally alter its service design to address these concerns. Specifically, the Commission is calling for the disabling of key addictive features, the implementation of effective “screen time breaks” (including during nighttime hours), and adjustments to the recommender system. This suggests a move away from the current model of maximizing engagement at all costs, towards a more user-centric approach that prioritizes wellbeing.
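Mechanically, a screen-time-break remedy amounts to a policy check layered over the feed. The sketch below shows one plausible shape; the thresholds and the nighttime window are assumptions for illustration, not values from the DSA or the Commission’s findings.

```typescript
// Sketch of a screen-time-break policy. All values are assumed, not official.
interface BreakPolicy {
  maxSessionMinutes: number; // continuous use before a forced pause
  nightStartHour: number;    // start of nighttime window (24h clock)
  nightEndHour: number;      // end of nighttime window
}

const policy: BreakPolicy = { maxSessionMinutes: 60, nightStartHour: 22, nightEndHour: 6 };

// Returns true if the feed should be paused and a break screen shown.
function shouldInterrupt(sessionMinutes: number, now: Date): boolean {
  const h = now.getHours();
  const isNight = h >= policy.nightStartHour || h < policy.nightEndHour;
  return sessionMinutes >= policy.maxSessionMinutes || isNight;
}

// e.g. 75 minutes into a session at 23:00 → interrupt on both counts
console.log(shouldInterrupt(75, new Date(2025, 0, 1, 23, 0)));
```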
The potential consequences for TikTok are significant. A non-compliance decision could result in a fine of up to 6% of ByteDance’s total annual global revenue – a substantial penalty for the platform’s parent company. However, the impact extends beyond financial repercussions. A forced redesign could fundamentally alter TikTok’s appeal and competitive position in the social media landscape.
TikTok’s Response and the Road Ahead
TikTok has vehemently denied the accusations, characterizing the Commission’s findings as “categorically false and entirely meritless.” The company has stated its intention to “take whatever steps are necessary to challenge these findings through every means available.” This sets the stage for a legal battle, where TikTok will likely argue that its features are not inherently harmful and that it already takes steps to protect its users.
TikTok now has an opportunity to respond to the Commission’s findings. This response will be crucial in shaping the outcome of the investigation. The Commission will then assess TikTok’s arguments and determine whether to issue a formal decision. If a non-compliance decision is reached, TikTok will be required to implement the changes outlined by the Commission, or face the aforementioned fines.
Broader Implications for Social Media Regulation
This case is not isolated to TikTok. It represents a broader trend of increasing regulatory scrutiny of social media platforms and their impact on users, particularly children. The EU’s DSA is at the forefront of this movement, setting a precedent for other jurisdictions to follow. Henna Virkkunen, the Commission’s executive vice-president for tech sovereignty, security and democracy, emphasized the potential for social media addiction to have “detrimental effects on the developing minds of children and teens,” underscoring the regulatory focus on protecting vulnerable users.
The outcome of this case will likely have ripple effects throughout the industry. Other social media companies may be compelled to proactively address similar concerns about addictive design features, even in the absence of formal regulatory action. The debate over the responsibility of platforms to protect users from the potential harms of their services is likely to intensify, and the DSA’s approach could serve as a model for future legislation.
The EU’s action against TikTok highlights the growing tension between the business models of social media companies – which often rely on maximizing user engagement – and the need to protect users from potential harm. As regulators around the world grapple with these challenges, the future of social media regulation remains uncertain, but one thing is clear: the era of unchecked platform power is coming to an end.
