EU Chat Control Passes Committee on November 26, 2025: "Voluntary" Surveillance, Mandatory Age Verification, and the Political Deception That Got It Through

Published: November 27, 2025

Executive Summary

On November 26, 2025, EU ambassadors in the Committee of Permanent Representatives (COREPER) approved a revised Chat Control proposal by a close split vote—but contrary to celebratory headlines claiming the EU "backed away" from mass surveillance, the approved text represents what privacy experts are calling the most dangerous version yet. While removing explicit mandatory scanning requirements, the Danish compromise creates a "toxic legal framework" that incentivizes voluntary mass surveillance, eliminates online anonymity through mandatory age verification, and threatens to exclude teenagers from digital life entirely.

The approval follows three previous failed attempts since 2022, with this version succeeding precisely because it disguises its surveillance mechanisms behind euphemistic language. As MEP Patrick Breyer warns: "Chat Control is not dead, it is just being privatized."

For anyone following our extensive coverage of Chat Control and the October 2025 blocking minority that temporarily stopped the proposal, this development represents a concerning evolution: surveillance proponents learned from their defeats and adapted their tactics.

Context for This Development: This is distinct from, but closely related to, the European Parliament's November 26 vote on social media age limits, demonstrating a coordinated, multi-front assault on digital privacy across different EU institutions and processes.

What Actually Got Approved: Breaking Down the Deception

The Official Story vs. The Reality

What Headlines Are Saying:

  • "EU backs away from mandatory Chat Control"
  • "Denmark proposes voluntary scanning compromise"
  • "Privacy concerns addressed in revised proposal"

What Actually Happened: On November 26, 2025, COREPER approved a negotiating mandate that creates three interlocking threats to digital privacy:

  1. Privatized Mass Surveillance: Making the temporary "Chat Control 1.0" framework permanent
  2. Death of Anonymous Communication: Mandatory age verification requiring ID for all digital services
  3. Digital Exclusion of Teenagers: Effective ban on users under 17 accessing communication platforms

Why "Voluntary" Is a Lie

The revised text removes explicit mandatory detection orders but introduces Article 4's requirement for providers to take "all appropriate risk mitigation measures" to ensure safety. This isn't a softening—it's linguistic camouflage for the same outcome.

Here's why "voluntary" scanning becomes effectively mandatory:

The Coercive Framework:

  • Providers must conduct risk assessments of their services
  • Risk categorization determines whether services are "high risk"
  • "High risk" services face mandatory mitigation obligations
  • Voluntary scanning is explicitly listed among the mitigation measures considered when determining a service's risk category
  • Translation: If you don't "voluntarily" scan, authorities classify your service as high risk, which triggers mandatory mitigation measures that include... scanning

As one privacy expert put it: "You are not required to volunteer to scan—but your required mitigation measures may include scanning voluntarily. That is logically incoherent: a measure cannot be both voluntary and a component of a required obligation."
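To see the circularity concretely, here is a minimal sketch of the incentive loop as critics describe it. The rule names, fields, and categories below are hypothetical illustrations of that reading, not text taken from the regulation:

```python
# Illustrative sketch of the "voluntary" scanning incentive loop described above.
# The classification rules are hypothetical and mirror the critics' reading of
# the risk-mitigation framework, not the regulation's actual wording.

def classify_risk(provider: dict) -> str:
    """A provider that does not 'voluntarily' scan is treated as higher risk."""
    if provider["end_to_end_encrypted"] and not provider["voluntary_scanning"]:
        return "high"
    return "standard"

def required_mitigations(risk: str) -> list[str]:
    """'High risk' services face mandatory mitigation obligations,
    with scanning listed among the acceptable measures."""
    if risk == "high":
        return ["age verification", "content scanning"]
    return ["age verification"]

messenger = {"end_to_end_encrypted": True, "voluntary_scanning": False}
risk = classify_risk(messenger)
print(risk)                        # -> high
print(required_mitigations(risk))  # -> scanning appears anyway

# Either the provider scans "voluntarily" and avoids the high-risk bucket, or it
# refuses and lands in a category whose mandatory measures include scanning.
# Both branches end in the same place.
```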

The Exemption Scandal: Privacy for Me, But Not for Thee

Perhaps the most damning evidence of bad faith comes from leaked documents showing EU officials exempt themselves from the surveillance regime. According to provisions in the proposal, "interpersonal communications services that are not publicly available, such as those used for national security purposes, should be excluded from the scope of this Regulation."

This means:

  • Politicians' messages: Not scanned
  • Military communications: Not scanned
  • Intelligence agency chats: Not scanned
  • Your family photos on WhatsApp: Scanned

As Patrick Breyer noted after the October blocking vote: "The fact that EU interior ministers want to exempt police officers, soldiers, intelligence officers and even themselves from chat control scanning proves that they know exactly how unreliable and dangerous the snooping algorithms are that they want to unleash on us citizens."

This creates exactly the two-tiered surveillance structure that authoritarian regimes employ: privacy for those in power, surveillance for everyone else.

The Three Hidden Dangers Nobody's Talking About

1. Permanent Mass Surveillance Under "Voluntary" Chat Control 1.0

The current temporary framework allows companies like Meta, Google, and Microsoft to scan private communications voluntarily. The approved text makes this permanent.

What This Means:

  • US tech giants can continue indiscriminate, warrantless scanning of millions of Europeans
  • No court order required
  • No reasonable suspicion needed
  • Scanning includes text messages, photos, videos, and metadata
  • Scanning algorithms with documented error rates of roughly 48-50%, per Germany's 2024 reporting data

The False Positive Crisis: Germany's Federal Criminal Police Office (BKA) reported that in 2024, 99,375 of 205,728 reports were criminally irrelevant—a 48.3% error rate. That's 99,375 cases in which innocent people had private content flagged and forwarded to authorities. These aren't just statistics—they're families whose beach photos, medical images shared with doctors, or artistic photography got sent to law enforcement for investigation.
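The arithmetic behind that figure, plus an illustration of how it scales, in a minimal sketch using only the BKA numbers quoted above (the volume multipliers are assumptions for illustration, not projections from the proposal):

```python
# False positive share from Germany's 2024 BKA figures cited above.
total_reports = 205_728
irrelevant_reports = 99_375  # reports found criminally irrelevant

error_rate = irrelevant_reports / total_reports
print(f"Share of reports that were criminally irrelevant: {error_rate:.1%}")  # ~48.3%

# Illustration only: at the same error rate, the absolute number of innocent
# people flagged grows linearly with the volume of scanning. The multipliers
# below are hypothetical.
for scale in (2, 5, 10):
    print(f"{scale}x scanning volume -> ~{irrelevant_reports * scale:,} irrelevant reports")
```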

2. The Death of Anonymous Communication: Mandatory Age Verification

To comply with requirements to "reliably identify minors," providers will be forced to verify the age of every single user.

The Reality:

  • Every citizen must upload an ID or undergo a face scan to access email or messenger services
  • Creates de facto ban on anonymous communication
  • Eliminates whistleblower protection channels
  • Destroys privacy for abuse victims seeking help
  • Ends anonymous political speech

Technical Impossibility: Over 400 scientists have warned that "age assessment cannot be performed in a privacy-preserving way with current technology due to reliance on biometric, behavioural or contextual information...In fact, it incentivizes (children's) data collection and exploitation."

This directly contradicts claims that the system protects privacy. You cannot reliably verify age without collecting invasive personal data.
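A minimal sketch of why that follows: every workable verification route has to ingest sensitive personal data somewhere. The method names and data fields below are illustrative assumptions for the sake of the argument, not a specification from the proposal:

```python
# Illustrative only: what each plausible age-verification route has to collect.
# Method names and data fields are assumptions, not requirements taken from the
# regulation; the point is that no branch verifies age without personal data.

AGE_VERIFICATION_METHODS = {
    "id_document_upload":      ["legal name", "date of birth", "document number"],
    "facial_age_estimation":   ["biometric face scan"],
    "behavioural_profiling":   ["browsing and interaction history"],
    "digital_identity_wallet": ["government-issued identity attributes"],
}

for method, collected in AGE_VERIFICATION_METHODS.items():
    assert collected, "no known method verifies age without collecting personal data"
    print(f"{method}: requires {', '.join(collected)}")
```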

3. Digital House Arrest for Teenagers

The proposal bars users under 17 from using apps with chat functions—including WhatsApp, Instagram, TikTok, and popular online games—unless stringent conditions are met.

The Consequence:

  • Isolation from social circles
  • Exclusion from digital education tools
  • Inability to participate in online community activities
  • "Protection by exclusion" rather than education and empowerment

As Breyer stated: "Protection by exclusion is pedagogical nonsense. Instead of empowering teenagers, the Council wants to lock them out of the digital world entirely."

How Denmark Deceived Europe: The False Crisis Narrative

Manufacturing Urgency Through Lies

According to leaked meeting minutes obtained by Netzpolitik, Danish Justice Minister Peter Hummelgaard told interior ministers that the European Parliament would refuse to extend current voluntary scanning provisions unless governments first agreed to the Chat Control 2.0 mandatory scanning regime.

This claim was demonstrably false.

Former MEP Patrick Breyer, who co-negotiated the Parliament's position, stated: "This is a blatant lie designed to manufacture a crisis. There is no such decision by the European Parliament. There has not even been a discussion on this issue."

The manufactured crisis created false urgency: "We must act now or lose child protection tools entirely!" This pressure tactic pushed wavering member states toward approval.

The Timeline of Deception

September 2025: Chat Control fails for the third time after Germany and Luxembourg join blocking minority (detailed coverage)

October 2025: Denmark drops mandatory scanning from proposal after blocking minority secures victory (analysis)

November 5, 2025: Denmark introduces "compromise" removing detection obligations but adding risk mitigation framework

November 12, 2025: Law Enforcement Working Party discusses new text with "broad support"

November 13, 2025: Denmark clarifies chat control shouldn't be mandatory "even through the back door of risk mitigation"—but the text still contains exactly that mechanism

November 26, 2025: COREPER approves mandate in close split vote

The Vote Breakdown: Who Stood for Privacy?

The approval came via a close split vote, with three countries voting against and one abstaining:

Against:

  • Czech Republic
  • Netherlands
  • Poland

Abstaining:

  • Italy

Context: This is far from the unanimous support Denmark claimed to have secured. The close vote reveals deep divisions within the EU about the legality and proportionality of the measure.

What's Next: The Trilogue Danger

The Fast-Track Timeline

The Council's approved mandate now moves to trilogue negotiations between:

  1. The Council (representing member state governments)
  2. The European Parliament
  3. The European Commission

Target Timeline: Finalize text before April 2026 when current voluntary scanning provisions expire

Why Parliament's Opposition May Not Save Us

The European Parliament adopted its position in November 2023, explicitly ruling out:

  • Indiscriminate scanning
  • Breaking end-to-end encryption
  • Mass surveillance without suspicion

However, Parliament has historically compromised on surveillance laws after political pressure during trilogues. The Council and Commission are now aligned—both want stronger online monitoring. This alignment is precisely what makes privacy groups nervous about a rushed trilogue where Parliament gives in to urgency.

Privacy advocates fear the manufactured April 2026 "crisis" will pressure Parliament negotiators to accept the Council's framework rather than allow voluntary scanning provisions to expire.

The Game Theory of Surveillance: Why Abuse Is Inevitable

Once a surveillance system exists, every actor with access faces the same choice: exploit the system for advantage, or trust everyone else will voluntarily restrain themselves.

The Payoff Structure:

  • Exploiting surveillance = High gain (political leverage, insider information, opponent suppression)
  • Cost of being caught = Low or distant
  • Cost of NOT exploiting when others do = Losing power to those who do

This creates a classic defection equilibrium where restraint is not the stable strategy. Over time, the probability that no one ever misuses the surveillance system approaches zero.
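A toy model of that payoff logic is sketched below. The numbers are invented purely to reflect the ordering described above (high gain from exploiting, low or distant penalty, a loss for restraining while others exploit); they are not empirical estimates:

```python
# Toy payoff model of the defection dynamic described above. Values are invented
# to match the stated ordering only; they are not empirical estimates.

EXPLOIT, RESTRAIN = "exploit", "restrain"

def payoff(me: str, others: str) -> int:
    gain_from_exploiting = 10     # political leverage, insider information
    expected_penalty = 1          # low or distant cost of being caught
    loss_if_only_i_restrain = 8   # losing power to those who do exploit
    if me == EXPLOIT:
        return gain_from_exploiting - expected_penalty
    return -loss_if_only_i_restrain if others == EXPLOIT else 0

for others in (RESTRAIN, EXPLOIT):
    best = max((EXPLOIT, RESTRAIN), key=lambda me: payoff(me, others))
    print(f"If others {others}: best response is to {best}")

# Exploiting is the best response whether others restrain or not, so restraint
# is not a stable equilibrium once the surveillance capability exists.
```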

Historical Precedent: Every surveillance system ever created has been abused beyond its stated purpose:

  • USA PATRIOT Act: Marketed for terrorism, used for drug enforcement
  • UK's Investigatory Powers Act: Expanded to dozens of agencies
  • France's emergency surveillance: Made permanent
  • NSA's metadata collection: Revealed by Snowden as far beyond authorized scope

The question is never "will this be misused?" The question is "when and by whom?"

The Technical Reality: Client-Side Scanning Cannot Work Safely

Why Encryption Experts Say It's Impossible

Over 500 cryptographers and security researchers signed an open letter declaring Chat Control "technically infeasible" and warning it creates "unprecedented capabilities for surveillance, control, and censorship."

The Core Problem: Client-side scanning requires installing monitoring software on users' devices that reads content before encryption. This fundamentally breaks the security model of end-to-end encryption, which promises only sender and recipient can read messages.

As Signal's Meredith Whittaker explained: "Scanning every message—whether you do it before, or after these messages are encrypted—negates the very premise of end-to-end encryption. Rather than having to break Signal's encryption protocol, attackers and hostile nation states would only need to piggyback on the access granted to the scanning system."
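A conceptual sketch of where the scan sits in the message flow helps show why. This is not any vendor's implementation; encrypt(), scan(), and the reporting path are placeholders standing in for whatever the scanning system would actually be:

```python
# Conceptual sketch: where client-side scanning sits in an E2EE message flow.
# encrypt(), scan(), and report_to_authority() are placeholders, not a real
# protocol or API.

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Stand-in for an end-to-end encryption step (e.g. a Signal-style ratchet)."""
    return bytes(b ^ k for b, k in zip(plaintext, key * len(plaintext)))

def scan(plaintext: bytes) -> bool:
    """Stand-in for the on-device classifier. Whoever controls its model or
    hash list effectively decides which private content gets reported."""
    return b"flagged-pattern" in plaintext

def report_to_authority(plaintext: bytes) -> None:
    print("content forwarded for review before encryption:", plaintext)

def send_message(plaintext: bytes, recipient_key: bytes) -> bytes:
    # The scan runs on the plaintext, before encryption ever happens, so the
    # scanning system (and anyone who compromises it) sees exactly what
    # end-to-end encryption was meant to keep between sender and recipient.
    if scan(plaintext):
        report_to_authority(plaintext)
    return encrypt(plaintext, recipient_key)

ciphertext = send_message(b"private note with flagged-pattern inside", b"k3y")
```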

The Single Point of Failure

Creating a backdoor "only for child protection" creates a vulnerability that:

  • Hackers can exploit
  • Foreign intelligence can leverage
  • Criminal organizations can compromise
  • Future authoritarian governments can abuse

There is no such thing as a backdoor that only good guys can use.

Connecting the Dots: The Coordinated Assault on Digital Privacy

Multiple Fronts, Same Goal

It's critical to understand the November 26 COREPER approval in the context of broader coordinated efforts:

1. European Parliament Social Media Age Limits (November 26, 2025)
The Parliament's vote established a minimum age of 16 for unrestricted social media access, with real age verification through EU Digital Wallets.

2. Chat Control Council Mandate (November 26, 2025)
The Council's approval creates the framework for mandatory age verification across ALL communication services.

3. Ireland's Surveillance Agenda
As documented in our extensive coverage, Ireland is simultaneously:

  • Building digital identity infrastructure (wallet system)
  • Pursuing aggressive content moderation enforcement (DSA investigations)
  • Advancing surveillance legislation (private communications monitoring)
  • Developing censorship frameworks (media monitoring laws)

4. Australia's Social Media Ban
Australia's December 10, 2025 implementation of its under-16 social media ban sets a global precedent.

The Pattern: Child Protection as Trojan Horse

Every expansion begins with child protection justifications:

  1. Age verification requires identity systems
  2. Identity systems create population-wide databases
  3. Databases enable tracking and monitoring
  4. Monitoring infrastructure extends to speech control
  5. Speech control becomes normalized enforcement

The infrastructure built for age verification serves multiple purposes—many having nothing to do with protecting children.

Why the "I Have Nothing to Hide" Argument Is Dangerous

The Dignity Argument

There is no greater violation of human dignity than forcing a person to live under permanent inspection. When AI systems are granted authority to scan your most private messages, photos, and thoughts, something fundamental breaks: the boundary between inner life and the outside world.

As Edward Snowden stated: "Arguing that you don't care about privacy because you have nothing to hide is like saying you don't care about free speech because you have nothing to say."

Private messages aren't "data"—they're the raw material of human beings:

  • Fears, desires, doubts
  • Intimate moments
  • Fragments of identity that exist because we believe they remain unseen

When algorithms judge, classify, and flag these, you cease being a person with a private inner world and become an object of analysis. Your conversations become datasets. Your memories become evidence. Your relationships become patterns.

The Practical Argument

Even if you personally have nothing to hide today:

1. Standards change: What's legal today may be prosecuted tomorrow. Historical examples abound of governments criminalizing previously protected activity.

2. False positives destroy lives: With 48-50% error rates, innocent people become suspects. Once flagged, clearing your name is difficult even when you've done nothing wrong.

3. Chilling effects: Knowing you're watched changes behavior. People self-censor. Important conversations don't happen. Dissent dies.

4. Power imbalances: Those with access to surveillance data gain leverage over everyone else. This invites blackmail, corruption, and abuse.

5. Mission creep: Every surveillance power expands. Built for child protection, used for everything else.

What Organizations and Individuals Should Do Now

For Social Media Platforms and Communication Services

Immediate Actions:

  • Review how risk assessment requirements apply to your service
  • Assess exposure to "high risk" classification based on encryption/anonymity
  • Evaluate technical feasibility of EU Digital Wallet integration
  • Model business impact of mandatory age verification
  • Identify senior management liability exposure

Strategic Planning:

  • Develop contingency for refusing to implement client-side scanning
  • Consider relocating services outside EU jurisdiction
  • Join industry coalitions opposing the regulation
  • Prepare legal challenges to implementation requirements

Communications:

  • Be transparent with users about surveillance obligations
  • Clearly explain privacy implications of compliance
  • Don't gaslight users by claiming broken encryption is still secure

For Governments and Civil Society

Advocacy Actions:

  • Contact MEPs before trilogue negotiations begin
  • Demand Parliament hold firm on November 2023 position opposing mass surveillance
  • Highlight the exemption scandal—demand equal privacy for all or none
  • Document and publicize false positive rates from current voluntary scanning
  • Support legal challenges in member states with constitutional privacy protections

Strategic Priorities:

  • Expose the "voluntary" framework's coercive nature
  • Challenge manufactured crisis narrative around April 2026 expiration
  • Build blocking minority for any trilogue compromise weakening Parliament position
  • Prepare constitutional challenges in national courts

For Individuals and Families

Privacy Protection:

  • Move to end-to-end encrypted services that refuse to implement client-side scanning (Signal has threatened EU exit)
  • Use VPNs to obfuscate communication patterns
  • Practice good operational security (OPSEC) for sensitive discussions
  • Educate family members about surveillance implications
  • Consider multiple communication channels for different sensitivity levels

Political Action:

  • Contact elected representatives expressing opposition
  • Support organizations like Fight Chat Control and EDRi
  • Share information about the proposal's real implications
  • Correct misinformation claiming it's "voluntary" or "privacy-preserving"
  • Vote for representatives who oppose mass surveillance

The Broader Implications: What This Means for Democracy

Inversion of Political Sovereignty

Democratic decision-making happens in both public and private: in conversations, debates, doubts, plans, disagreements, and countless invisible dialogues that precede every meaningful political act.

When private communication is monitored, the people's sovereignty is quietly transferred to those who control the surveillance infrastructure. The sovereign—the people—become the surveilled. The government—supposedly the people's servant—becomes the watcher.

This inversion is the hallmark of authoritarianism masquerading as democracy.

The Disintegrative Phase

As complexity scientist Peter Turchin would argue, societies don't collapse because of one bad law—they collapse when their elites, losing legitimacy, turn coercive tools inward.

This is exactly what Chat Control represents:

  • Elite overproduction (Brussels bureaucrats divorced from popular will)
  • Rising instability (massive protests against surveillance)
  • Loss of societal cohesion (two-tiered privacy system)
  • State responding with repression rather than renewal

A confident political order doesn't need to monitor every private message. Only a system aware of its own internal weakness reaches for such powers.

By trying to control its population instead of restoring trust, the EU is accelerating the very instability it fears.

Conclusion: The Fight Isn't Over

The November 26, 2025 COREPER approval is a significant setback for digital privacy, but it's not the final word. The text now moves to trilogue negotiations where the European Parliament still has opportunity to reject the Council's surveillance framework.

Critical Timeline:

  • Now - April 2026: Trilogue negotiations
  • April 2026: Current voluntary scanning provisions expire (manufactured "crisis" point)
  • Late 2026: Possible final adoption if Parliament compromises

What Must Happen:

  1. Parliament must hold firm on November 2023 position excluding indiscriminate scanning
  2. Member states must reject false crisis narrative
  3. Technical experts must continue documenting impossibility of secure client-side scanning
  4. Citizens must flood representatives with opposition
  5. Platforms must refuse to implement rather than break encryption

The Reality Check: Contrary to headlines suggesting Chat Control is dead or softened, the approved text creates the legal framework for:

  • Privatized mass surveillance of 450 million Europeans
  • Death of online anonymity through mandatory ID verification
  • Exclusion of teenagers from digital life
  • Two-tiered system where elites keep privacy while citizens are surveilled

As Patrick Breyer concluded: "We must stop pretending that 'voluntary' mass surveillance is acceptable in a democracy. We are facing a future where you need an ID card to send a message, and where foreign black-box AI decides if your private photos are suspicious. This is not a victory for privacy; it is a disaster waiting to happen."

The window to stop this is still open—but it's rapidly closing.

Privacy is the light they are trying to dim. We can make it shine brighter.



About the Author: This analysis is part of our ongoing coverage of digital surveillance, privacy rights, and cybersecurity regulation at MyPrivacy.blog and ComplianceHub.wiki.

Stay Informed: Subscribe to our newsletter for updates on Chat Control negotiations, digital identity implementation, and privacy legislation developments.

Take Action: Visit Fight Chat Control to contact your representatives and join the resistance to mass surveillance.

Sources: Patrick Breyer press releases, COREPER meeting minutes, EU Council documentation, Netzpolitik reporting, TechRadar coverage, and statements from privacy organizations EDRi, Signal Foundation, and encrypted service providers.
