The EU’s Digital Services Act (DSA) represents a profound regulatory shift for digital intermediaries. Designed to create a safer, more transparent, and accountable online environment, the DSA imposes a layered regime in which obligations escalate for online platforms, culminating in stringent duties for very large online platforms and very large online search engines (VLOPs and VLOSEs). For platforms operating in or targeting the EU market, compliance is not simply a matter of regulatory formality: it is now a central element of operational and legal risk management. This article outlines the practical implications of the DSA for these providers, focusing on the main operational risks and emerging litigation trends.
- New operational risks under the DSA framework
The DSA introduces a suite of obligations that will affect the very core of how digital platforms design, operate, and monetise their services. Among the most significant are new transparency requirements for advertising, prohibition of dark patterns, disclosure obligations concerning recommender systems, and enhanced protection of minors online.
Platforms must now ensure that users can clearly identify the entity behind each advertisement and understand why they are being targeted. This includes the obligation to display “meaningful information” about the main parameters used for targeting and, where applicable, how to change those parameters. These rules significantly affect ad-driven business models and require the redesign of interfaces and internal documentation.
The prohibition of dark patterns — manipulative design practices that distort or impair users’ ability to make free and informed decisions — further constrains product design and marketing strategies. The European Commission and national Digital Services Coordinators are already signalling that enforcement in this area will be vigorous, particularly where dark patterns are used to influence consent or subscription decisions (see, for example, the Commission’s preliminary findings against X in July 2024).
Transparency obligations also extend to recommender systems: platforms must explain in “plain and intelligible language” the main parameters determining how information is presented, as well as offer users meaningful options to influence or modify these settings. This requirement, while aiming to enhance user autonomy, forces platforms to revisit long-standing algorithms and potentially disclose commercially sensitive information.
Finally, the DSA imposes increased obligations to protect minors, including by prohibiting the use of targeted advertising based on profiling when the recipient is a minor. Platforms accessible to children must now perform specific risk assessments and adapt their design and moderation systems accordingly — a step that may carry both technical and compliance challenges.
- Marketplace-specific obligations and risks
Marketplaces face particularly stringent rules under the DSA. They are now subject to enhanced traceability obligations for professional traders, requiring them to collect and verify essential information before allowing traders to offer products or services through their interface. This includes verifying identity details, contact information, and relevant trade or registration numbers.
In addition, marketplaces must inform consumers clearly about the identity of the trader and whether the seller is acting in a professional or private capacity — a distinction that determines the applicability of EU/national consumer protection laws. The DSA also imposes design requirements to ensure that such information is accessible before a transaction takes place.
These obligations create new operational risks. Platforms may face liability if they fail to obtain accurate trader information or if the design of their interface is deemed to obscure key information. In practice, this will require the implementation of robust onboarding processes, verification systems, and continuous monitoring mechanisms.
- The VLOP/VLOSE regime and systemic risk management
The DSA establishes an additional layer of obligations for very large online platforms (VLOPs) and very large online search engines (VLOSEs) — those reaching more than 45 million average monthly active recipients in the EU. These entities must assess, mitigate, and report on systemic risks arising from the design or use of their services, including the dissemination of illegal content and negative effects on fundamental rights, electoral processes, and public security.
They must conduct annual independent audits of their compliance with the DSA and provide regulators with access to internal data for supervisory purposes. The European Commission directly oversees these platforms.
It is therefore unsurprising that the first DSA-related investigations and disputes have focused on VLOPs/VLOSEs. The Commission has already opened several formal proceedings, including against X, Meta, and TikTok, over alleged breaches of systemic risk and transparency obligations. These actions demonstrate the EU’s determination to use the DSA’s enforcement mechanisms swiftly and visibly.
Faced with this scrutiny, some platforms have sought to challenge their classification as VLOPs, arguing that the designation process is opaque or disproportionate. However, the EU General Court’s judgment of 3 September 2025, rejecting Zalando’s appeal against its designation, confirms the Commission’s broad latitude in defining which platforms qualify as “very large.”
- Moderation of illegal content: obligations and operational impact
One of the DSA’s most practical dimensions concerns content moderation. Building on the liability framework of the 2000 E-Commerce Directive, the DSA codifies and refines the “notice and action” mechanism: platforms remain exempt from liability for illegal content uploaded by users, provided they act promptly to remove or disable access upon obtaining actual knowledge of its illegality.
To ensure transparency and accountability, platforms must establish user-friendly notice mechanisms, handle complaints diligently, and provide reasoned explanations for removal or refusal to remove content. They are also required to implement safeguards against misuse, such as repetitive or malicious notifications.
These rules will have tangible operational implications. Platforms must train moderation teams, invest in automated detection systems that respect due process guarantees, and maintain extensive documentation to demonstrate compliance in case of investigation or dispute.
While the moderation of illegal content has long been a source of litigation under the E-Commerce Directive, the DSA’s more detailed framework is likely to generate an increased volume of disputes. Questions around the definition of “illegal content,” the proportionality of removal decisions, and the adequacy of procedural safeguards will continue to be tested before national and European courts.
- Emerging litigation trends
The DSA introduces a new right for users to seek compensation for damages resulting from a platform’s infringement of its obligations. This provision opens the door to direct civil litigation across the EU and could trigger a new wave of coordinated claims by users, consumers, or traders. For digital platforms, this represents a substantial shift: compliance failures may now translate not only into regulatory fines but also into private damages claims.
As enforcement of the DSA progresses, litigation will play a key role in defining what platform accountability and user rights look like in practice across Europe. Court decisions will shape how concepts such as “systemic risk” and algorithmic disclosure are interpreted and applied.
For digital platforms, the DSA represents more than a compliance checklist. It requires proactive legal and operational strategies, strong internal coordination, and ongoing engagement with regulators. Companies that incorporate DSA requirements into their governance frameworks and product design will be better equipped to manage risks, demonstrate accountability, and potentially even influence how the new regulatory landscape evolves.
Ela Barda, partner, Signature Litigation Paris
Philippine Chapot, legal intern, Signature Litigation Paris