A critical decision looms for Germany as it weighs a new EU proposal dubbed ‘chat control’. The plan would require messaging apps to scan private messages for child sexual abuse material (CSAM). While the goal is to protect children, the approach has raised concerns about privacy and the potential misuse of such technology.

Mandating widespread scanning would chill the encryption relied on by journalists, activists, businesses, and families alike. The unintended consequences could be significant: false positives that put innocent users under suspicion, large databases of flagged content vulnerable to hacking, and a tool whose scope could expand beyond CSAM to other kinds of content.

Instead of blanket scanning, experts advocate targeted investigations, faster cross-border removal of illegal content, improved support systems for victims, and more resources for specialized police units. These alternatives protect rights and preserve encryption, leaving room for innovation without compromising privacy.
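To make the encryption concern concrete, here is a minimal Python sketch of how a client-side check against a list of known-content hashes could work in principle. The function names, the placeholder blocklist, and the use of an exact SHA-256 match are illustrative assumptions only; the proposal does not prescribe a specific technology, and the systems discussed in this debate typically rely on perceptual hashing or classifiers instead. The point of the sketch is simply that any such check must inspect the plaintext on the device, before the message is encrypted.

```python
# Hypothetical illustration of client-side scanning, not the mechanism mandated
# by the EU proposal. All names and values below are assumptions for the sketch.
import hashlib

# Placeholder blocklist of known-bad content hashes (illustrative values only).
KNOWN_BAD_HASHES = {
    "0" * 64,  # a real system would ship a curated database of hashes
}


def hash_attachment(data: bytes) -> str:
    """Return the SHA-256 hex digest of an outgoing attachment."""
    return hashlib.sha256(data).hexdigest()


def scan_before_send(data: bytes) -> bool:
    """Return True if the attachment matches the blocklist (i.e. would be flagged).

    Because this check runs before encryption, it shows why critics argue that
    client-side scanning undermines end-to-end encryption: the content is
    inspected on the device, outside the encrypted channel.
    """
    return hash_attachment(data) in KNOWN_BAD_HASHES


if __name__ == "__main__":
    attachment = b"example attachment bytes"
    print("flagged:", scan_before_send(attachment))
```

Even in this simplified form, the design choice is visible: the scanning step sits between the user composing a message and the encryption layer, which is exactly the placement that privacy advocates object to.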