What is the Digital Services Act (DSA)?
The Digital Services Act (DSA) is a landmark European Union law designed to create a safer digital space for users and to hold online platforms accountable for illegal and harmful content. It establishes a comprehensive set of rules for all digital services operating in the EU, from social media and online marketplaces to search engines and hosting services. The DSA entered into force on November 16, 2022, and became fully applicable to all regulated services on February 17, 2024; designated very large online platforms (VLOPs) and very large online search engines (VLOSEs) had to comply earlier, from late August 2023. The European Commission provides detailed information on the DSA.
Key Objectives and Scope
The DSA aims to protect fundamental rights online, including freedom of expression, information, and consumer protection. It addresses issues such as illegal content, disinformation, online advertising, and the transparency of online platforms’ algorithms. The law applies to any provider offering digital services within the EU, regardless of where the provider is established. The European Parliament outlines the key changes brought about by the DSA.
The DSA categorizes digital services into different tiers based on their size and risk profile. VLOPs and VLOSEs, defined as services with 45 million or more monthly active users in the EU, face the most stringent obligations. These include conducting risk assessments, implementing mitigation measures, and providing greater transparency to users and authorities. The U.S. Federal Trade Commission offers a summary of the DSA’s requirements.
What are the main obligations under the DSA?
The DSA imposes a wide range of obligations on digital service providers, tailored to their size and role. These obligations fall into several key areas. Platforms must remove illegal content promptly upon receiving a valid notice, and they must establish mechanisms for users to report such content. They must also be more transparent about their content moderation policies and how those policies are enforced.
Illegal Content and User Reporting
A core requirement of the DSA is the “notice-and-action” mechanism. Users and “trusted flaggers” (organizations with specialized knowledge) can report illegal content to platforms. Platforms must then act expeditiously on these notices, removing the content if it violates EU law or national law implementing EU law. DLA Piper provides a detailed analysis of the notice-and-action mechanism. The DSA defines illegal content broadly, encompassing hate speech, terrorist content, counterfeit goods, and other unlawful material.
For example, on March 19, 2024, the European Commission formally requested information from Meta Platforms about its measures to combat the spread of illegal content related to the Russia-Ukraine war (see the European Commission press release on the Meta request).
Transparency and Accountability
The DSA requires VLOPs and VLOSEs to be more transparent about their algorithms and recommender systems. They must explain to users how these systems work and allow them to opt out of personalized recommendations. Platforms must also provide access to data for researchers studying the impact of their services on society. Access Now details the algorithmic transparency requirements of the DSA. This increased transparency aims to empower users and enable greater scrutiny of platforms’ practices.
On February 23, 2024, the European Commission formally requested data from X (formerly Twitter) regarding its measures to counter the dissemination of illegal content and disinformation, notably in the context of the upcoming European Parliament elections (see the European Commission press release on the X data request).
What are the potential consequences of non-compliance?
Non-compliance with the DSA can result in significant penalties, including fines of up to 6% of a provider’s global annual turnover and, for repeated serious infringements, a temporary suspension of the service in the EU.
