Author:
Nelson Chen, University of Sydney, Australia
Editor:
Wenjia Tang, University of Sydney, Australia
In December 2025, the European Commission fined X €120 million for violating the Digital Services Act’s transparency rules. X’s breaches included the deceptive design of its blue checkmark, the lack of transparency of its advertising repository, and its failure to provide legally mandated data access to researchers. By targeting X’s core EU operations and moving to formal enforcement, the fine has unleashed the kind of political and corporate uproar that comes when accountability begins to bite.
The backlash is being misread as evidence of European regulatory overreach. But this is exactly what happens when enforcement starts to work as intended.

Backlash is not a sign of DSA failure. It is a predictable part of deterrence. The DSA’s transparency obligations were designed to force platforms to change their internal processes: how they design features, how they document decisions, and how they report risks. Regulation is not in place to keep firms happy. Research on how public authorities convert political pressure into actual compliance behaviour has shown repeatedly that resistance tends to spike just before genuine adjustment.
What Elon Musk calls a “censorship regime” is a legal framework with escalating investigative and sanctioning powers, ranging from requests for information and audits to binding commitments, fines and, if necessary, temporary suspension of services. Since X designed systems to mislead users, withheld advertising transparency, and shut out researchers, its violations sit well within that structure.
Crucially, we already know that the DSA mechanism works when applied consistently. In October 2025, the Commission found TikTok and Meta in breach of similar transparency rules involving data access for researchers, notice and action mechanisms, and content moderation appeals. Both companies were required to issue remedial plans. Within weeks, TikTok committed to retaining all advertising content, including linked pages, and pledged to publish criteria for its distribution in a usable repository. These are not symbolic gestures. Instead, they are operational commitments backed by legally enforceable deadlines.
Enforcing user‑rights provisions is steadily changing platform behaviour. In the second half of 2024, TikTok and Meta users challenged 16 million content-removal decisions through the DSA’s framework. A 35% success rate means that roughly one in three challenged removals was overturned, an extraordinary correction rate that would not exist without the enforcement of transparency rules. Once more, backlash from platforms did not stop compliance. It preceded it.
There is still real work to be done. Transparency tools must be functional and accurate. Meta’s abandoned CrowdTangle and TikTok’s Virtual Compute Environment show how easily platforms can stall or dilute their DSA obligations. Europe’s regulators must prioritise independent auditing capacity, clear standards for comparable reporting, and compliance scorecards that force platforms to explain discrepancies.
But none of these shortcomings justifies retreat. The political heat around X’s fine is not an indictment of the DSA. Rather, it is evidence that accountability is at last landing where it should. Europe cannot allow performative outrage, diplomatic noise, or Big Tech defensiveness to dictate the tempo of enforcement.
The noise will fade. The compliance will remain if regulators hold their nerve.
Learn more about our authors:
Dr Nelson Chen is an editor at Policy & Internet and is based at the University of Sydney.