What is the Digital Services Act?
The Digital Services Act (DSA) is a European Union law that establishes a comprehensive set of new obligations for online platforms and services to protect basic rights online, specifically addressing illegal and harmful content.
Enacted on October 20, 2022, and fully applicable as of February 17, 2024, the DSA aims to create a safer digital space where illegal content can be quickly removed and users are empowered to report concerns. It applies to a wide range of online services, from social media platforms and online marketplaces to search engines and hosting services. The law categorizes services based on their size and impact, with Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) facing the most stringent requirements.
For example, VLOPs, defined as platforms with 45 million or more monthly active users in the EU, such as Meta’s Facebook and Instagram and ByteDance’s TikTok, are required to conduct risk assessments, implement mitigation measures, and provide greater transparency regarding their algorithms and content moderation practices. As of January 26, 2026, non-compliant companies face fines of up to 6% of their global annual revenue, as outlined on the European Commission’s official DSA webpage.
What problems does the DSA address?
The DSA addresses several key problems related to online safety and accountability, including the spread of illegal content, disinformation, and harmful products.
Prior to the DSA, the EU’s e-Commerce Directive (2000/31/EC) provided a safe harbor for online intermediaries, shielding them from liability for user-generated content as long as they acted “expeditiously” to remove illegal material upon notification. However, this system was criticized as slow, inconsistent, and ineffective against the rapid spread of harmful content online. The DSA modernizes this framework by imposing proactive obligations on platforms to prevent the dissemination of illegal content and protect users from harm.
A 2023 report by the European Digital Media Observatory (EDMO) found that disinformation campaigns targeting the 2024 European Parliament elections increased by 35% compared to the 2019 elections, highlighting the urgency of addressing online manipulation. The DSA directly tackles this issue by requiring platforms to implement measures to counter disinformation and promote media literacy, as detailed in EDMO’s 2023 Disinformation Trends Report.
What are the key obligations under the DSA?
The DSA introduces a tiered system of obligations based on the size and nature of the online service. Key obligations include enhanced due diligence, transparency requirements, and user redress mechanisms.
All online intermediaries are required to implement a notice-and-action mechanism, allowing users to report illegal content. Online platforms must also provide clear terms and conditions and explain their content moderation policies. VLOPs and VLOSEs face more extensive obligations, including conducting risk assessments of systemic risks such as the spread of illegal content, disinformation, and negative impacts on fundamental rights. They must also implement mitigation measures to address these risks, provide transparency regarding their algorithms, and allow independent auditing of their systems.
On March 19, 2024, the European Commission designated 22 VLOPs and VLOSEs, including Amazon, Apple, Google, and X (formerly Twitter), as subject to the most stringent requirements under the DSA. These companies were given four months to comply with the new obligations, as announced in the European Commission’s press release. Failure to comply can result in significant fines.
How does the DSA impact users?
The DSA aims to empower users by giving them more control over their online experience and providing them with effective redress mechanisms.
Users now have the right to challenge content moderation decisions made by platforms and to seek redress from national authorities if they believe their rights have been violated. The DSA also prohibits platforms from using dark patterns – deceptive design practices that manipulate users into making unintended choices. Furthermore, the law requires platforms to provide users with clear and accessible explanations of why content has been removed or restricted.
As of December 2025, over 10,000 complaints related to DSA violations had been filed with national Digital Services Coordinators across the EU, according to data released by the European Commission on January 15, 2026. These complaints primarily concerned illegal content, hate speech, and unfair content moderation practices, demonstrating growing awareness and use of the DSA’s user rights, as reported by Euractiv.
