The EU Council has now gone through a fourth presidency term without passing its controversial message-scanning proposal. The just-concluded Belgian Presidency failed to broker a deal to push forward this regulation, which has now been debated in the EU for more than two years.
To everyone who signed the “Don’t Scan Me” petition: thank you. Your voice is being heard. News reports indicate that the sponsors of this flawed proposal withdrew it because they could not get a majority of member states to support it.
Now, it’s time to stop attempting to compromise encryption in the name of public safety. EFF has opposed this legislation from the start. Today, we’ve published a statement, along with EU civil society groups, explaining why this flawed proposal should be withdrawn.
The scanning proposal would create “detection orders” allowing messages, files, and photos from hundreds of millions of users around the world to be compared against government databases of child abuse images. At some points during the debate, EU officials even suggested using AI to scan text conversations and predict who would engage in child abuse. That’s one reason some opponents have labeled the proposal “chat control.”
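To make concrete what “compared to government databases” would mean in practice, here is a minimal, purely illustrative Python sketch of hash-based matching, the general technique behind such scanning systems. The blocklist contents, the function name, and the use of exact SHA-256 matching are assumptions made for illustration; deployed systems generally rely on perceptual hashing rather than exact digests, and nothing here is taken from the proposal itself.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests standing in for a government
# database of known abuse imagery. Real matching systems typically use
# perceptual hashes (PhotoDNA-style), which also catch visually similar
# images, but the overall flow is the same: hash the content, then look it up.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_attachment(data: bytes) -> bool:
    """Return True if this attachment's digest appears in the blocklist."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# On an end-to-end encrypted messenger, a check like this can only run on
# the user's own device before encryption, which is why critics describe
# detection orders as client-side scanning that undermines encryption.
if scan_attachment(b"example image bytes"):
    print("match: content would be flagged and reported")
else:
    print("no match: content sent normally")
```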
There’s scant public support for government file-scanning systems that break encryption. Nor is there support in EU law. People who need secure communications the most—lawyers, journalists, human rights workers, political dissidents, and oppressed minorities—will be the most affected by such invasive systems. Another group harmed would be those whom the EU’s proposal claims to be helping—abused and at-risk children, who need to securely communicate with trusted adults in order to seek help.
The right to have a private conversation, online or offline, is a bedrock human rights principle. When surveillance is used as an investigative technique, it must be targeted and coupled with strong judicial oversight. In the coming EU Council presidency, which will be led by Hungary, leaders should drop this flawed message-scanning proposal and focus on law enforcement strategies that respect people’s privacy and security.
“AI will evaluate your texts” is a wildly dystopian idea.
Hot dog. Not a hot dog.
Only a few steps from your telly listening to you…
If it’s a “smart TV”, it already scans the content you watch and phones home about it.
That’s why I don’t have one.
The EU has some great ideas, and also some horribly bad ones.
In the same way that we all want paper straws and attached bottle caps?
Much of what the EU has implemented consists of great things that we do want, but it isn’t about what we want or don’t want.
I mean, what’s the problem with attached bottle caps? They’re pretty cool, and they don’t really get in the way.
From my point of view, some are OK and some are terrible and get in the way no matter how you use them.
I think there are bigger things to worry about.
We need a law against using AI to enforce the law.