EU Fines X €120 Million Over Transparency Violations: Censorship or Consumer Protection?


December 5, 2025

The European Union has imposed a €120 million fine on Elon Musk's social media platform X (formerly Twitter), marking the first penalty under the bloc's Digital Services Act since it came into force. The decision has ignited fierce debate about whether the EU is protecting consumers or engaging in what critics call an assault on free speech.

Related: Digital Compliance Alert: UK Online Safety Act and EU Digital Services Act Cross-Border Impact Analysis

The Violations: What the EU Found

The European Commission's investigation, launched in December 2023, identified three specific violations of the Digital Services Act's transparency requirements:

1. Deceptive Blue Checkmark System (€45 million)

After Musk acquired Twitter in 2022, the platform transformed its verification system from authenticating notable accounts into a subscription service where anyone could obtain a blue checkmark for $8 monthly. The EU determined that this change made it difficult for users to distinguish authentic accounts from potential impersonators and scammers.

EU Tech Chief Henna Virkkunen explained the concern: "The system itself makes it difficult to understand what it really is. Anyone can pay to obtain the 'verified' status without the company meaningfully verifying who is behind the account."

2. Advertising Archive Opacity (€35 million)

The Commission found X's advertising archives failed to provide adequate transparency about how targeted advertising works. Users could not understand how ads were being targeted to them or the parameters used for ad delivery—a violation of DSA transparency obligations designed to help users understand and control how their data influences what they see.

3. Blocking Researcher Access (€40 million)

X placed "unnecessary barriers" preventing researchers from accessing public data on the platform. Under Article 40 of the DSA, platforms must grant researchers access to public data to detect, identify, and understand systemic risks in the European Union. This restriction prevents independent analysis of information manipulation, illegal content spread, and other potential harms.

The EU's Justification: Not About Content, But Transparency

EU officials have firmly rejected characterizations that this fine represents censorship. Virkkunen directly addressed the criticism: "This decision is about the transparency of X and nothing to do with censorship."

The Digital Services Act does not dictate what content platforms can or cannot host. Instead, it requires platforms to be transparent about:

  • How their systems work
  • Who is behind accounts
  • How advertising targets users
  • What data they collect and use
  • How they moderate content

The law explicitly maintains the EU's ban on general content monitoring, precisely to protect free speech. Platforms are not required to proactively scan for illegal content—they must simply be transparent about their operations and respond appropriately when issues arise.

The Free Speech Counterargument

Critics, including US Vice President JD Vance, have characterized EU digital regulation as censorship through bureaucratic means. Vance warned against "attacking" US firms through what he called regulatory "suffocation."

The Trump administration's national security strategy, released the same day as the fine, explicitly urged Europe to "abandon its failed focus on regulatory suffocation."

For deeper analysis of these concerns, see: Global Digital Compliance Crisis: How EU/UK Regulations Are Reshaping US Business Operations and AI Content Moderation

This perspective argues that:

1. Transparency Requirements Create Chilling Effects

Forcing platforms to explain their algorithmic systems and moderation decisions may discourage innovation and create legal liability that leads to over-moderation.

2. EU Overreach

The extraterritorial application of EU law to US companies represents regulatory imperialism, where European values are imposed on global platforms.

3. Selective Enforcement

Critics suggest the EU disproportionately targets platforms that don't align with European political preferences, particularly those allowing broader speech than EU authorities prefer.

4. Who Defines "Transparency"?

Requirements to make algorithmic systems "transparent" can be subjective, giving regulators broad discretion to penalize platforms for technical choices rather than actual harms.

A comprehensive counter-proposal: The Internet Bill of Rights: A Framework for Digital Freedom in the Age of Censorship

The Technical Reality: What DSA Actually Requires

Understanding the actual requirements helps clarify whether this is about speech or safety:

For technical implementation details, see: Navigating the Digital Crossroads: EDPB's Groundbreaking Guidelines on DSA-GDPR Interplay

Blue Checkmarks: The DSA doesn't prohibit paid verification. It requires platforms to clearly explain what verification means. If a checkmark suggests authentication, the platform must actually authenticate. If it's just a paid subscription badge, that must be clear to users to prevent impersonation scams.

Ad Transparency: Platforms must maintain searchable repositories showing who paid for ads, who was targeted, and what criteria were used. Users should know if they're seeing an ad because of their age, location, political views, or other factors.

Researcher Access: Public data (tweets visible to everyone) must be accessible to vetted researchers studying platform risks. This doesn't grant access to private messages or non-public information.
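The ad-transparency requirement above is essentially a data-modeling obligation: every ad must be traceable to a payer and a set of targeting parameters. The sketch below is purely illustrative — the `AdArchiveEntry` structure and field names are my assumptions, not the DSA's schema or X's actual archive format — but it shows the kind of record a searchable repository would need to expose, and how missing disclosures (the opacity the Commission cited) could be flagged:

```python
from dataclasses import dataclass

@dataclass
class AdArchiveEntry:
    """Hypothetical record in a DSA-style searchable ad repository."""
    ad_id: str
    sponsor: str              # who paid for the ad
    targeting_criteria: dict  # e.g. {"age": "25-34", "location": "DE"}
    shown_from: str           # campaign start date (ISO)
    shown_until: str          # campaign end date (ISO)

def transparency_gaps(entry: AdArchiveEntry) -> list[str]:
    """Flag disclosures whose absence would leave users unable to
    tell who paid for an ad or how it was targeted."""
    gaps = []
    if not entry.sponsor:
        gaps.append("sponsor undisclosed")
    if not entry.targeting_criteria:
        gaps.append("targeting parameters undisclosed")
    return gaps

# An entry with neither disclosure fails both checks:
opaque = AdArchiveEntry("ad-001", "", {}, "2025-01-01", "2025-02-01")
print(transparency_gaps(opaque))
```

The point of the sketch is that the DSA obligation is about what the repository must reveal, not about what ads may say.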

Previous Context: Why This Matters

X is the first platform to receive a formal non-compliance fine under the DSA, making this a test case for EU enforcement. The investigation began under very different circumstances—before Trump's return to the presidency with Musk as a close advisor.

The timing is politically fraught. The EU delayed finalizing this investigation through mid-2024, with the changing US political landscape clearly weighing on enforcement decisions. The decision to proceed despite potential diplomatic tension signals EU determination to enforce its regulations regardless of US pressure.

The Broader Implications

This case crystallizes fundamental questions about digital governance:

Context on the broader regulatory landscape: Briefing on the 2025 Global Digital Privacy, AI, and Human Rights Landscape

For Speech Advocates: Does requiring transparency about verification, advertising, and algorithms constitute censorship? Or is transparency itself a free speech value, enabling users to make informed decisions about what they trust?

For Consumer Protection: Should platforms be allowed to fundamentally mislead users about verification systems, letting anyone appear "official" for a fee? Does this create fraud risks that justify regulation?

For Global Tech: Can different regulatory regimes coexist, or must there be a winner-take-all approach where either US tech self-regulation or EU regulatory oversight prevails globally?

See also: Global Approaches to Online Content Regulation: A Comparative Analysis and Navigating the Global Data Privacy Maze: A Strategic Imperative for Modern Businesses

For Democratic Oversight: Should elected governments have authority to impose transparency requirements on platforms that shape public discourse? Or does any such regulation inherently threaten the open internet?

What Happens Next

X can appeal the fine to EU courts, which could take years to resolve. The platform remains under investigation for additional DSA violations related to illegal content and information manipulation.

Meanwhile, the Trump administration has signaled it may retaliate against EU regulations it views as targeting American companies. This could escalate into a broader trade dispute affecting sectors beyond digital services.

France's digital affairs minister Anne Le Henanff called the decision "historic," stating: "By sanctioning X, Europe shows it is capable of moving from words to action."

The Central Question

The core disagreement isn't whether platforms should be regulated—even the most libertarian advocates accept some boundaries around fraud and illegality. The question is whether transparency requirements constitute legitimate consumer protection or are a Trojan horse for content control.

Critics argue that "transparency" becomes censorship when compliance costs are so high that platforms over-moderate to avoid regulatory risk. Supporters contend that without transparency obligations, platforms become unaccountable black boxes that can manipulate users without oversight.

Both sides claim to champion the same value—free expression—while disagreeing fundamentally about whether democratically enacted transparency laws protect or threaten it.

The €120 million fine against X represents the opening salvo in what promises to be a prolonged battle over these questions. Whether it's remembered as a defense of consumer rights against predatory design or as regulatory overreach stifling innovation may depend less on the law's actual provisions than on which broader vision of internet governance ultimately prevails.

Protecting Your Privacy on Social Media

While regulators and platforms battle over transparency requirements, individual users can take immediate steps to protect their privacy across social media platforms:

For content creators and influencers who must balance public visibility with personal protection, see: The Complete Guide to Influencer & Content Creator Privacy


About the Digital Services Act: The DSA entered into force in November 2022 and became fully applicable in February 2024. It applies graduated obligations based on platform size, with the strictest requirements for Very Large Online Platforms (VLOPs) serving more than 45 million monthly EU users. X was designated as a VLOP in April 2023.
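The graduated-obligation rule above reduces to a simple size test. The sketch below encodes only the 45-million-user threshold stated in this article (the function name is mine, and in practice VLOP status follows a formal Commission designation decision, not just the raw count):

```python
# DSA size bar for Very Large Online Platforms, per the figure cited above.
VLOP_THRESHOLD = 45_000_000  # monthly EU users

def is_vlop(monthly_eu_users: int) -> bool:
    """True if a platform exceeds the VLOP size threshold."""
    return monthly_eu_users > VLOP_THRESHOLD

print(is_vlop(46_000_000))  # above the bar, so subject to the strictest tier
```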

The Fine Breakdown:

  • €45 million: Blue checkmark deception
  • €35 million: Advertising archive opacity
  • €40 million: Blocking researcher access
  • Total: €120 million

The fine was calculated based on the nature and duration of violations rather than as a percentage of X's global revenue, which is the typical approach for EU competition fines.
