The EU Is Trying to Break Encryption — And Most People Have No Idea

The EU Parliament just voted to end mass message scanning, but negotiations are still live and a hard deadline is weeks away. Here's what Chat Control actually says and why it matters for everyone's privacy.

Since 2022, the European Union has been debating a law that would require online platforms to scan private messages for illegal content. The proposal is formally called the Child Sexual Abuse Regulation. Critics call it Chat Control. After years of political deadlock, the legislation reached a critical turning point this month: on March 11, 2026, the European Parliament voted 458 to 103 to end the mass surveillance of private messages, adopting a position that any scanning must target only specific users identified by a judge — not entire populations indiscriminately. Trilogue negotiations between the EU Council, Parliament, and Commission are now underway under significant time pressure, with the current interim scanning regime set to expire on April 6, 2026.

Most people outside of digital rights circles have never heard of it.

That is a problem, because what is being decided in Brussels over the next few months has significant implications for the privacy of digital communications — not just in Europe, but potentially as a precedent for governments worldwide.


What the Law Proposes

The stated purpose of the Child Sexual Abuse Regulation is to prevent and combat child sexual abuse material online. Nobody disputes that goal. The dispute is about the method.

The core mechanism proposed is automated scanning of digital communications — messages, images, and video — to detect illegal content before or after it is transmitted. The controversy centers on how this scanning is technically supposed to work, particularly for services that use end-to-end encryption.

End-to-end encryption means that a message is encrypted on the sender's device and can only be decrypted by the recipient. The service provider — WhatsApp, Signal, your email provider — cannot read the content. This is what distinguishes genuinely private communications from services that can be read by the platform.

To scan end-to-end encrypted messages, the scanning has to happen on the device itself, before the message is encrypted. This technique is called client-side scanning. Security experts and the European Parliament's own researchers have been clear about the implication: once you install scanning software on a device that runs before encryption is applied, you have effectively broken the encryption guarantee. The message is no longer private in any meaningful sense. It can be read by whoever controls the scanning system.


Where Things Stand Now

The proposal has gone through several iterations since 2022. The most controversial versions would have made message scanning mandatory for all providers, including those using end-to-end encryption. After sustained opposition from privacy advocates, security researchers, and several EU member states — Germany was a consistent holdout — the Danish presidency of the EU Council revised the proposal in late 2025 to remove the explicit mandatory scanning requirement.

Here is where things stand as of March 2026:

  • November 26, 2025 — EU Council endorses revised position, dropping mandatory scanning but preserving voluntary scanning
  • December 9, 2025 — First trilogue negotiation between Council, Parliament, and Commission
  • February 26, 2026 — Second trilogue session
  • March 11, 2026 — European Parliament votes 458-103 to end mass scanning, requiring judicial authorisation for any targeted scanning
  • April 6, 2026 — Interim regulation expires — a hard deadline creating immediate pressure on negotiators
  • May 4, 2026 — Third trilogue scheduled
  • June 29, 2026 — Fourth and expected final trilogue
  • July 2026 — Formal adoption anticipated

The removal of mandatory scanning is widely described as a victory for privacy advocates. But the current version of the proposal remains contested.


Why the Revised Version Still Raises Concerns

The Council's revised text dropped explicit mandatory detection orders but preserved voluntary scanning — meaning platforms can choose to scan messages that are not end-to-end encrypted. It also introduced mandatory age verification requirements and what critics describe as vaguely worded "risk mitigation" obligations for encrypted services. The word "voluntary" is doing significant work here: platforms that fail to demonstrate adequate risk mitigation face regulatory consequences, which creates strong pressure to scan even without a legal mandate. In practice, the line between voluntary and compulsory becomes difficult to distinguish.

Patrick Breyer, a German digital rights lawyer and former Member of the European Parliament who has tracked the legislation closely, has warned that the structure of the revised regulation could lead to mass surveillance without formally mandating it. His concern focuses on Article 5 of the Council's mandate, which requires providers to "contribute effectively" to detecting illegal content. For encrypted services, he argues, this wording creates pressure to weaken encryption in practice, even without an explicit legal requirement.

The European Data Protection Supervisor and the European Data Protection Board have previously stated in a joint opinion that the proposal "could become the basis for de facto generalized and indiscriminate scanning of the content of virtually all types of electronic communications."

The European Court of Human Rights ruled in February 2024, in an unrelated case, that requiring degraded end-to-end encryption "cannot be regarded as necessary in a democratic society."

The European Commission's own implementation report, published in November 2025, acknowledged that there is no proven link between scanning private messages and actual convictions or children rescued. The report also noted that perpetrators can easily migrate to other platforms where no scanning takes place.


The Technical Problem That Cannot Be Argued Away

Beyond the legal debate, there is a technical reality that the regulation cannot resolve by political compromise.

The European Parliament commissioned an independent impact assessment which concluded that there are currently no technological solutions capable of detecting child sexual abuse material without producing an unacceptably high rate of false positives. At scale — across billions of messages — even a very low false positive rate translates into millions of ordinary communications being flagged and reviewed.
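The scale argument is simple arithmetic. The numbers below are illustrative assumptions, not figures from the impact assessment, but they show how a rate that sounds negligible per message becomes enormous across a population:

```python
# Expected false positives at scale. Both inputs are assumptions for
# illustration: the volume is a rough order of magnitude for EU-wide
# messaging traffic, and 0.1% is a deliberately generous error rate.
daily_messages = 10_000_000_000   # ~10 billion messages per day (assumption)
false_positive_rate = 0.001       # 0.1% of scanned messages wrongly flagged

wrongly_flagged_per_day = daily_messages * false_positive_rate
print(f"{wrongly_flagged_per_day:,.0f} innocent messages flagged per day")
# prints "10,000,000 innocent messages flagged per day"
```

Ten million ordinary communications per day, each nominally requiring review, from an error rate of one in a thousand. Lowering the rate by an order of magnitude still leaves a million a day.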

The Max Planck Institute for Security and Privacy, in an analysis of client-side scanning, noted that detection software "would be embedded in the messaging app or the operating system to scan chat content and automatically forward any material flagged as prohibited to law enforcement agencies." Once content is accessible to a party other than the sender and recipient, the encryption protection is gone — regardless of what the law says about voluntary versus mandatory.

Multiple intelligence agencies across EU member states, along with cybersecurity researchers, have warned against any regulation that weakens encryption. Their concern is not abstract: encryption protects financial transactions, medical records, legal communications, journalistic sources, and political dissidents, in addition to private personal communications.


Why This Matters Beyond Europe

EU legislation tends to set standards that extend beyond EU borders. When the General Data Protection Regulation came into force in 2018, it reshaped data privacy practices globally because companies serving European users had to comply regardless of where they were based. A Chat Control regulation with teeth would create similar pressure.

If the EU establishes a legal framework that normalizes scanning private communications — even framed as voluntary, even scoped to illegal content — it provides a template for other governments to follow. Authoritarian governments do not need new ideas about surveillance. But a democratic precedent makes the argument harder to resist.

The European Parliament's March 11 vote is a significant development — it means Parliament enters the remaining trilogue sessions with a strong, clearly stated position against mass surveillance. But the final outcome depends on negotiations with EU governments, many of which have resisted restrictions on broader scanning. The Council's appetite for targeted-only scanning remains limited, and the Commission's position has not shifted. Whatever emerges from the June 29 final session could look very different from what Parliament approved.


The Bottom Line

Chat Control is not a fringe proposal. It is active legislation, currently in final negotiations, with a scheduled conclusion date. The most aggressive version — mandatory scanning of all encrypted messages — has been scaled back. But the revised version still contains provisions that privacy experts, the EU's own data protection authorities, and independent security researchers have flagged as fundamentally incompatible with private communications.

The argument on the other side is genuine: child sexual abuse material causes real harm, and detection and removal matters. The disagreement is not about the goal. It is about whether mass scanning of private communications is an effective, proportionate, or legally permissible way to achieve it — and whether, once the infrastructure exists, it stays limited to that purpose.

Those are questions worth understanding before the law is passed.

By the time most people hear about Chat Control, the decision will already have been made.