The Digital Services Act: A new era of platform accountability in Europe

With the advent of the EU’s Digital Services Act (DSA), digital platforms operating in the European market face a generational regulatory shift.
For consumers, the new rules should make the online environment safer, more transparent and more accountable. Operators of major online platforms and search engines, however, face a layered regime of stringent duties backed by a strict and enforceable legal framework. Looking ahead, compliance will no longer be a simple matter of regulatory formality; the DSA will be a central element of operational and legal risk management with major practical implications.
Among the most significant obligations imposed by the DSA are new transparency requirements for advertising, disclosure obligations concerning recommender systems, a prohibition on dark patterns, and enhanced protection of minors online.
First, platforms must now ensure that users can clearly identify the entity behind each advertisement and understand why they are being targeted. This includes the obligation to display “meaningful information” about the parameters used for targeting, as well as the ability to modify them. For ad-driven business models, these rules require the redesign of interfaces and internal documentation.
Manipulative design practices used to influence consent or subscription decisions, also known as dark patterns, are now prohibited, inevitably constraining product design and marketing strategies. Early signals from regulators indicate that enforcement of these provisions will be vigorous.
Another profound shift is the obligation on platforms to explain in “plain and intelligible language” the main parameters determining how information is presented to users. Users must also be given meaningful options to modify these settings. This requirement, while aiming to enhance user autonomy, forces platforms to revisit long-standing algorithms and potentially disclose commercially sensitive information.
Finally, perhaps the most talked-about aspect of the DSA is the manner in which it prohibits the use of targeted advertising based on profiling when the recipient is a child. Platforms accessible to minors must now perform specific risk assessments and adapt their systems accordingly, which may carry both technical and compliance challenges.
Online marketplaces face particularly stringent checks under the DSA, including enhanced traceability obligations that require them to verify essential information about professional traders, such as their identity details, before allowing them to sell products through their interfaces. Marketplaces must also inform consumers, in a clear and accessible way, whether a seller is acting in a professional or private capacity before a transaction takes place.
If platforms fail to obtain accurate trader information, or obscure those details from consumers, they risk liability. In practice, this will require robust onboarding processes, verification systems and continuous monitoring mechanisms.
Platforms reaching more than 45 million monthly active users in the EU face an extra layer of obligations. These entities must assess, mitigate, and report on systemic risks arising from the design or use of their services, including the dissemination of illegal content and negative effects on fundamental rights, electoral processes, and public security. They must also undergo annual independent audits of their compliance with the DSA and provide regulators with access to internal data for supervisory purposes.
As one would expect, the first DSA-related investigations and disputes have focused on the largest online platforms, with the European Commission already opening formal proceedings against X, Meta, and TikTok over alleged breaches of systemic risk and transparency obligations. These actions demonstrate the EU’s determination to use the DSA’s enforcement mechanisms swiftly and visibly.
Moderation of illegal content uploaded by users is one of the DSA’s most practical dimensions. It modernises and strengthens the existing liability framework, under which platforms remain exempt from liability provided they act promptly to remove or disable access to illegal content upon obtaining actual knowledge of its illegality. This more detailed framework is likely to generate an increased volume of disputes, with questions around the definition of “illegal content,” the proportionality of removal decisions, and the adequacy of safeguards likely to be tested in court.
Litigation will play a key role in defining what platform accountability and user rights look like in practice across Europe. Court decisions will shape how concepts such as “systemic risk” and algorithmic disclosure are interpreted and applied. For example, the DSA introduces a new right for users to seek compensation for damages resulting from a platform’s infringement of its obligations. This provision opens the door to direct civil litigation and could trigger a new wave of coordinated claims by users, consumers, or traders.
For digital platforms, compliance failures may now translate not only into regulatory fines but also into private damages claims. Inevitably, the DSA represents more than a compliance checklist for such companies. It will require proactive legal and operational strategies, strong internal coordination, and ongoing engagement with regulators. Companies that incorporate DSA requirements into their governance frameworks and product design will be better equipped to manage risks, and may even influence how the new regulatory landscape evolves.
Ela Barda, Signature Litigation