Germany has a big choice to make. The EU is debating a plan nicknamed “chat control” that would make messaging apps scan private chats to find child sexual abuse material. The aim is to protect children. But the method is like putting a camera in everyone’s living room to catch a few criminals. Once such a scanner exists, it’s hard to limit what it looks at or how it might be used later.
Strong encryption protects journalists, activists, businesses, and families. If we force apps to scan messages on our phones, we weaken that protection. New risks appear: false positives that accuse innocent people, big databases that hackers will try to steal, and a tool that could be expanded to other kinds of content. Privacy by default would become checking by default, which flips a core democratic idea.
Supporters say tech must help fight awful crimes. They’re right about the goal, but blanket scanning isn’t the only option. We can focus on targeted, court‑approved investigations, faster removal of illegal content across borders, better support for survivors, and more resources for specialist police. These steps are harder than one big mandate, but they protect rights and keep encryption intact. For a related look at how identity and state access debates are unfolding in the UK, see this explainer on Britain’s compulsory digital ID and its implications: Britain’s Compulsory Digital ID Explained Guide.
Germany’s decision carries weight. Its vote can shape the EU outcome and send a message worldwide. A scanning mandate could push companies to weaken encryption, split features by region, or even leave markets rather than break security promises. It could also favor tech giants that can afford compliance systems, squeezing smaller privacy‑first apps and slowing innovation. There’s another path worth noting: privacy‑preserving identity that people control themselves. For a primer on how self‑sovereign identity can reduce data sharing and still prove “who you are,” see: Own Your Identity: Decentralized, Secure & Self‑Sovereign.
The real question isn’t “privacy or children.” It’s how to protect both. Encryption is like a seatbelt: it quietly keeps people safe every day. We should improve policing and victim services without cutting that belt. If Germany supports scanning, mass inspection could become normal. If it defends end‑to‑end encryption, it shows security and rights can advance together. For the record, the EU plan at issue is the proposed permanent Child Sexual Abuse Regulation (COM(2022) 209), which includes “detection orders” that authorities could issue to providers, spelled out in Annex I of the proposal.

By Oscar Harding

