EU Chat Control is back: How mass scanning laws threaten VPNs, encryption and your private messages
The European Union is once again pushing forward with its most controversial digital surveillance proposal in years. Under the harmless-sounding name Child Sexual Abuse Regulation (CSAR) – better known as "Chat Control" – the EU is exploring laws that would normalize client-side scanning of private messages, weaken end-to-end encryption, and potentially create new pressure on VPN providers to log and deanonymise users.
For anyone in Europe who relies on VPNs, secure messaging apps or encrypted email, this is not an abstract policy debate. It is a direct, structural risk to the technologies you use to keep your data and communications private.
In this analysis we will explain, in practical and technical terms, what Chat Control is, how it interacts with the newer ProtectEU internal security roadmap, and what this combination could mean for VPN logging, encryption backdoors and the future of online privacy in Europe.
TL;DR – Why VPN users should care about Chat Control
- Chat Control (CSAR) would allow authorities to order scanning of private chats, emails and photos, including on end-to-end encrypted services, using client-side scanning.
- The newer ProtectEU roadmap goes further, arguing for broader data retention, easier access to encrypted data and restrictions on anonymity tools.
- Client-side scanning breaks the core guarantees of end-to-end encryption by inserting surveillance code on your device before messages are encrypted.
- Proposals linked to CSAR and ProtectEU explicitly mention VPNs and anonymity services as problems to be "solved" – potentially via logging and identification requirements.
- Even if the current drafts fail, the political direction is clear: mass-scanning infrastructure will keep coming back under different names until it is clearly rejected.
1. What exactly is "Chat Control"?
The CSAR proposal was first presented by then–Home Affairs Commissioner Ylva Johansson in May 2022. The stated goal is to combat the spread of child sexual abuse material (CSAM) online. To do this, the regulation would create a new system of "detection orders" that can be issued to online platforms and communication providers.
Under various drafts analysed by civil-rights groups and VPN providers, these detection orders could require companies to:
- Scan private messages, photos, videos and files for known CSAM (hash matching).
- Use AI/ML classifiers to identify "new" CSAM that does not match existing hashes.
- Detect "grooming" patterns in text conversations between adults and minors.
- Report matches to a central EU body and national law-enforcement agencies, who would then decide whether to open investigations.
The most controversial aspect is that this scanning would apply even to end-to-end encrypted (E2EE) services. To achieve that, the EU and several member states are pushing for client-side scanning (CSS): code running directly on your device that analyses content before it is encrypted and sent.
The European Parliament's civil-liberties committee (LIBE) has already rejected indiscriminate scanning and amended its position to protect end-to-end encryption, insisting that any surveillance must be targeted and based on warranted suspicion. However, the Council of the EU (member states) has repeatedly tried to reintroduce mass-scanning through vague language about "risk mitigation" and "high-risk services".
A detailed technical and legal breakdown of the proposal is available from Mullvad VPN's campaign page "And Then?" and from Pirate Party MEP Patrick Breyer's analysis.
2. Why client-side scanning breaks end-to-end encryption
End-to-end encryption is a very simple idea: only the communicating endpoints possess the keys to decrypt messages. Even the service provider cannot read the contents. Client-side scanning deliberately breaks this model by inserting a new actor – the scanning code – on your device.
In practice, a CSAR-style client-side scanning system would work like this:
- You type a message or attach a photo in a messaging app.
- Before encryption, a scanning module compares the content against a database of known CSAM hashes and runs machine-learning models to classify it.
- If the classifier flags the content as suspicious, the app uploads the plaintext and metadata for review.
- Only after this step is complete is the content encrypted and sent to the recipient.
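The scanning flow above can be sketched in a few lines of Python. This is a conceptual illustration only: `KNOWN_HASHES`, `classify_content` and `report_to_authority` are hypothetical placeholders, and real deployments would use perceptual hashes (such as PhotoDNA) rather than plain SHA-256, so that re-encoded images still match.

```python
import hashlib

# Hypothetical sketch of a client-side scanning (CSS) pipeline.
# All names here are illustrative placeholders, not a real API.

KNOWN_HASHES = {
    # Digests of known illegal files. Real systems use perceptual
    # hashes that tolerate resizing and re-encoding; SHA-256 is used
    # here only to keep the sketch self-contained.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def classify_content(data: bytes) -> float:
    """Stand-in for an ML classifier; returns a 'suspicion' score in 0..1."""
    return 0.0  # placeholder

def report_to_authority(data: bytes) -> None:
    """Stand-in for the plaintext upload step under a detection order."""
    print("flagged: plaintext and metadata uploaded for review")

def send_message(data: bytes, encrypt) -> bytes:
    # 1. Scan BEFORE encryption -- this is the step that breaks E2EE:
    #    the scanner sees the plaintext, whatever cipher is used later.
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_HASHES or classify_content(data) > 0.5:
        report_to_authority(data)
    # 2. Only then is the content encrypted and handed to the network.
    return encrypt(data)
```

The key point the sketch makes visible: whatever `encrypt` does is irrelevant to the scanner, because the scanner already ran on the plaintext one step earlier.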
From a systems-security perspective, this is not a "sniffer dog" standing next to your messages; it is a general-purpose backdoor that can be repurposed for any scanning function regulators want later.
Security researchers, the Electronic Frontier Foundation (EFF) and over 300 cryptographers have warned that once you introduce client-side scanning, you have destroyed the promise of end-to-end encryption:
- The scanning code and its update channel become a high-value target for attackers, spyware vendors and hostile states.
- Any vulnerability in the scanner gives an attacker pre-encryption access to messages and files.
- The scanner can be silently repurposed to search for anything: political speech, leaks, "extremism", copyright violations, or dissident activity.
Even EU institutions acknowledge the danger. Drafts discussed under the Danish Council presidency would exempt state and government accounts from scanning, precisely because of the unacceptable risk of exposing official communications. If the backdoor is too dangerous for governments, it is even more dangerous for ordinary citizens.
3. False positives and real-world harm
Proponents often argue that any privacy trade-off is justified if it helps rescue children. Unfortunately, empirical data from existing CSAM detection systems shows that false positives are a serious and systemic problem.
- In Germany, law-enforcement data reported by compliance specialists shows error rates around 48% in some automated detection systems – meaning nearly half of reports were not actually illegal content.
- In Ireland, only 852 out of 4,192 automated reports forwarded to authorities in 2022 involved actual CSAM – an accuracy rate of roughly 20%.
- The Swiss Federal Police and Dutch authorities have similarly warned that holiday photos, family pictures and consensual teen-to-teen sharing are routinely misclassified.
Every false positive is not an abstract statistic. It is a real person whose communications are flagged, whose device may be seized, and whose life can be upended while investigations grind on. At the same time, analysts are flooded with noise, making it harder to focus on genuine abuse cases.
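The Irish figures above can be checked directly, and they show why precision collapses at EU scale. The projection below to a large daily message volume is an illustrative assumption of ours, not an official estimate; the false-positive rate and traffic numbers are made up to show the base-rate effect.

```python
# Precision of the Irish 2022 automated reports cited above.
true_positives = 852
total_reports = 4192
precision = true_positives / total_reports
print(f"precision: {precision:.1%}")  # roughly 20%

# Illustrative (assumed) projection: even a scanner with a 0.1%
# false-positive rate, applied to a billion messages a day of which
# almost all are legal, still buries analysts in false alarms.
messages_per_day = 1_000_000_000     # assumed volume, for illustration
false_positive_rate = 0.001          # assumed, for illustration only
innocent_share = 0.9999              # assume almost all traffic is legal
false_alarms = messages_per_day * innocent_share * false_positive_rate
print(f"false alarms per day: {false_alarms:,.0f}")
```

This is the classic base-rate problem: when the condition being searched for is very rare, even a highly accurate classifier produces mostly false positives in absolute terms.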
The EU's own legal services have warned that such generalised, suspicionless scanning of private communications is likely incompatible with Articles 7 and 8 of the EU Charter of Fundamental Rights (privacy and data protection), and with the European Court of Human Rights' case law on mass surveillance.
4. Lobbying, conflicts of interest and "ChatControlGate"
Beyond the technical issues, the legislative process behind CSAR has been dogged by accusations of opaque lobbying and conflicts of interest. Investigations by European media and digital-rights organisations have documented a tight feedback loop between the Commission and a network of organisations that both advocate for Chat Control and sell the technology needed to implement it.
Key findings include:
- U.S.-based nonprofit Thorn, co-founded by actor Ashton Kutcher, has been one of the loudest advocates for Chat Control. According to reporting by outlets including Fortune and Le Monde, Thorn held numerous high-level meetings with EU officials while simultaneously marketing its proprietary scanning system Safer.
- Thorn and allied organisations are funded in part by the Oak Foundation, which has invested tens of millions of dollars in child-protection tech and lobbying efforts related to CSAR.
- Emails obtained via freedom-of-information requests show Commissioner Johansson thanking Thorn for its help in crafting the proposal and describing the regulation as a response that "would not exist" without them.
- Meanwhile, established child-protection hotlines and independent European civil-society organisations report that their expertise was largely ignored during the drafting phase.
European Digital Rights (EDRi) and others have referred to this as "ChatControlGate" – a textbook example of regulatory capture where those who stand to profit from a law are deeply involved in writing it.
For a detailed timeline of the lobbying network, see EDRi's analysis "How a Hollywood star lobbies the EU for more surveillance" and the joint investigation summarised on Tuta Mail's blog.
5. From Chat Control to ProtectEU: building a permanent surveillance stack
Even as CSAR struggles to clear the Council, the Commission has launched a broader initiative called ProtectEU. Framed as a strategy for "internal security" and "lawful access" to data, ProtectEU is best understood as the next layer in the same surveillance architecture.
According to analysis by Italian digital-rights group Osservatorio Nessuno and others, the ProtectEU roadmap aims to:
- Revisit and expand data retention rules, pushing towards more systematic and longer-term storage of telecommunications metadata, despite repeated court rulings against indiscriminate retention.
- Promote "solutions" that "facilitate access to encrypted data" by 2030 – in practice, this means persistent pressure on providers to weaken or bypass end-to-end encryption.
- Encourage development of AI-based forensic tools that can sift through large data sets – often without adequate safeguards against bias, misuse or unlawful evidence gathering.
- Treat privacy and anonymity tools as obstacles rather than fundamental components of a democratic digital infrastructure.
A leaked high-level group recommendation cited by the Commission goes even further, suggesting requirements for universal identification, permanent logging of electronic communications, and legal penalties for services that refuse to build surveillance capabilities into their products.
In that context, Chat Control is not a standalone aberration. It is one piece of a broader shift towards normalising mass surveillance infrastructure across the EU, with VPNs and encryption explicitly in the crosshairs.
6. Where the law stands in November 2025
After multiple failed attempts to reach consensus in the Council, the "classic" version of Chat Control looked close to collapse in early 2025. However, the Danish EU presidency, which took over in July 2025, made CSAR a flagship priority and reintroduced a heavily tweaked version in the autumn.
As of mid-October 2025:
- A planned Council vote on 14 October was cancelled when a blocking minority of member states, led by Germany and Luxembourg, refused to support encryption-breaking provisions.
- According to reporting by Brussels Signal, the rough tally was 12 states in favour, 8 opposed and 7 undecided – short of the qualified majority needed.
- The latest Danish drafts attempted to remove explicit mention of "mandatory" scanning while maintaining the mechanisms that would make scanning de facto unavoidable for large platforms.
VPN-friendly MEPs and civil-society coalitions have called this a strategy of "backdoor reintroduction", where the core architecture of mass scanning is preserved but wrapped in more ambiguous language. The battle now is less about the headline term "Chat Control" and more about the underlying client-side scanning and data-retention machinery.
Looking ahead, negotiations between the Commission, Council and Parliament (the so-called trilogues) are expected to run into early 2026. There is still time for voters, technical experts and businesses to influence the final outcome – but the window is narrowing.
7. What does all this mean for VPNs and privacy tools?
From a VPN and network-security perspective, the combination of CSAR and ProtectEU presents several distinct risks.
7.1 Client-side scanning cannot be "VPNed away"
A VPN encrypts traffic between your device and a VPN server. It does not control what your operating system or apps do before that point. If client-side scanning is mandated and implemented at the OS or app level, then:
- Your VPN will still protect you against ISP-level and network-level surveillance, which remains important.
- But the scanner on your device will see messages and files before they are encrypted, and can upload them in plaintext regardless of your VPN.
- In the worst case, scanning code could even treat VPN use as a risk signal, triggering additional scrutiny.
This is why so many security professionals consider client-side scanning to be a categorical red line. It changes your device from a tool you control into an endpoint for a distributed surveillance system.
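The layering can be made concrete with a toy sketch. The "crypto" here is deliberately fake (byte reversal and XOR); only the ordering of the layers matters, and all function names are hypothetical.

```python
# Conceptual sketch of where each layer sits on a device with mandated
# client-side scanning. The transforms are toys, not real cryptography.

def css_scan(plaintext: bytes) -> bytes:
    # Runs on the device and sees plaintext BEFORE any encryption.
    print(f"scanner saw: {plaintext!r}")
    return plaintext

def e2ee_encrypt(plaintext: bytes) -> bytes:
    return plaintext[::-1]  # toy stand-in for the messaging app's E2EE

def vpn_encrypt(packet: bytes) -> bytes:
    return bytes(b ^ 0x42 for b in packet)  # toy stand-in for the tunnel

# Order of operations: scan, then E2EE, then the VPN tunnel.
wire_bytes = vpn_encrypt(e2ee_encrypt(css_scan(b"private message")))
# The VPN only ever wraps the already-scanned, already-encrypted
# payload; it cannot undo what the scanner observed one layer below.
```

The VPN layer is the outermost wrapper: it protects the bytes in transit, but by the time it runs, the scanner has already had its look.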
7.2 Pressure on VPN logging and identification
Proposals around ProtectEU and related high-level recommendations frequently mention "anonymity services" and "obstacles to lawful access". In practice, this often means VPNs, Tor and similar tools. Depending on how future legislation is written and enforced, we could see:
- Mandatory connection logging for VPN providers operating in or targeting the EU market, including retention of source IPs, timestamps and exit IPs.
- Identification requirements (for example via SIM-linked phone numbers or eID) for VPN accounts, making truly anonymous subscriptions difficult.
- De facto market bans for strict no-logs providers that refuse to implement logging and identification, especially if they have a significant EU customer base.
- Increased legal risk for companies offering privacy-enhancing infrastructure (self-hosted VPNs, privacy relays, onion gateways) without logging.
Not all of these ideas have made it into concrete law yet, but they are clearly visible in the policy conversation around ProtectEU and "lawful access". For VPN users, the key point is that jurisdiction and legal posture matter more than ever.
7.3 Increased importance of audits, open source and multi-jurisdiction setups
If you are choosing a VPN in Europe in 2025, you should no longer assume that a friendly-looking privacy policy is enough. In a world of shifting surveillance laws, you want providers that:
- Undergo regular third-party audits of their infrastructure and no-logs claims.
- Publish open-source clients (and ideally core libraries) so the community can review how they handle configuration, updates and telemetry.
- Have multi-jurisdiction architectures, so that they can move sensitive operations out of hostile legal environments if necessary.
- Are willing to publicly challenge overreaching orders in court rather than silently complying.
For a broader overview of how VPNs actually work and what they can – and cannot – protect you from, see our guide What Is a VPN? Complete Guide to Virtual Private Networks in 2025.
8. Practical steps for VPN users in Europe
While the final shape of CSAR and ProtectEU is still in flux, there are concrete actions you can take now to harden your setup and support privacy-friendly outcomes.
8.1 Harden your personal privacy stack
- Use a reputable VPN with independently audited no-logs claims and a clear, technically detailed privacy policy.
- Prefer messaging apps and email providers that publicly oppose client-side scanning and have a track record of defending end-to-end encryption.
- Keep your operating systems and apps updated, but pay close attention to new "safety" features that might introduce scanning or intrusive telemetry.
- Reduce your data exhaust by using privacy-focused browsers, hardened settings and separate profiles for different activities.
If you're actively shopping for a VPN during Black Friday and Cyber Monday, our best VPN deals overview highlights providers with strong technical security and transparent policies.
8.2 Stay informed and support digital-rights groups
- Follow updates from organisations like EDRi, Fight Chat Control, the Electronic Frontier Foundation and national digital-rights NGOs.
- Use tools like fightchatcontrol.eu to contact your MEPs and national ministers, especially if you live in a country that has not taken a clear position.
- Pay attention to how your preferred VPN and secure-communication providers communicate about CSAR and ProtectEU. Providers that stay silent on such fundamental issues are making a statement too.
8.3 Separate marketing myths from threat models
Finally, it is important to keep a realistic mental model of what VPNs and encryption can do under pressure from new laws. A VPN is still an essential part of a modern privacy toolkit, but it is not a magic invisibility shield.
- Assume that platform owners and OS vendors can see more than your VPN provider can.
- Treat any system with client-side scanning as effectively compromised from a confidentiality perspective, no matter what the marketing says.
- Diversify your risk: avoid putting all your sensitive traffic, storage and identity in a single jurisdiction or with a single company.
9. The bottom line
The EU's Chat Control and ProtectEU initiatives are part of a longer-term trend: reframing privacy and strong encryption as obstacles to security, rather than as preconditions for a free and safe digital society. For VPN users, this means that the landscape is shifting from "Which provider is fastest?" to "Which provider can survive legal pressure without betraying my trust?"
The good news is that this is not inevitable. The European Parliament has already pushed back hard against mass scanning. Several member states have blocked dangerous drafts. Hundreds of experts have publicly documented the flaws in client-side scanning. But the proposals will keep coming – renamed, repackaged and rebranded – until there is a clear, durable political decision that mass surveillance of private communications is off the table.
Until then, the best you can do is to stay informed, choose tools and providers that are willing to fight for encryption, and make your voice heard in the legislative process. VPNs remain a critical layer of defence – but they need a legal environment that respects privacy to be truly effective.