A controversial push by European Union lawmakers to legally require messaging platforms to scan citizens’ private communications for child sexual abuse material (CSAM) could lead to millions of false positives per day, hundreds of security and privacy experts warned in an open letter published Thursday.
Concern over the EU proposal has been building since the Commission proposed the CSAM scanning plan two years ago, with independent experts, lawmakers from the European Parliament and even the bloc’s own Data Protection Supervisor among those sounding the alarm.
The EU proposal would not only require messaging platforms that receive a CSAM detection order to scan for known CSAM; they would also have to use unspecified detection technologies to try to identify unknown CSAM and spot grooming activity as it takes place, leading to accusations that lawmakers are indulging in magical thinking about what technology can do.
Critics argue that the proposal asks for the technologically impossible and will not achieve the stated goal of protecting children from abuse. Instead, they say, it will destroy Internet security and Internet users’ privacy by forcing platforms to carry out blanket surveillance of all their users through the deployment of risky, unproven technologies such as client-side scanning.
Experts say there is no technology capable of achieving what the law demands without causing far more harm than good. Yet the EU is plowing ahead regardless.
The latest open letter refers to amendments to the draft CSAM scanning regulation recently proposed by the European Council, which the signatories argue do not address fundamental flaws in the draft.
The letter’s signatories, numbering 270 at the time of writing, are mostly academics, including well-known security experts such as Professor Bruce Schneier of the Harvard Kennedy School and Dr. Matthew D. Green of Johns Hopkins University, along with some researchers working for technology companies such as IBM, Intel and Microsoft.
An earlier open letter, signed last July by 465 academics, warned that the detection technologies the legislative proposal hinges on forcing platforms to adopt are “deeply flawed and vulnerable to attack”, and would lead to a significant weakening of the vital protections provided by end-to-end encrypted (E2EE) communications.
Little traction for counter-proposals
Last fall, MEPs in the European Parliament united behind a substantially revised approach, which would limit scanning to individuals and groups already suspected of child sexual abuse; restrict it to known and unknown CSAM, removing the requirement to scan for grooming; and remove any risks to E2EE by limiting scanning to platforms that are not end-to-end encrypted. However, the European Council, the other co-legislative body involved in EU lawmaking, has yet to take a position on the matter, and where it lands will influence the final shape of the law.
The latest amendments on the table were put forward in March by the Belgian Council presidency, which is leading discussions on behalf of EU member state governments. But in the open letter the experts warn that this proposal still fails to address the fundamental flaws in the Commission’s approach, arguing that the revisions would still create “unprecedented capabilities to monitor and control Internet users” and would “undermine… secure digital future for our society and could have huge implications for democratic processes in Europe and beyond.”
Elements to be discussed under the Council’s amended proposal include a suggestion that detection orders could be made more targeted by applying risk categorization and risk mitigation measures, and that cybersecurity and encryption could be protected by ensuring platforms are not obliged to create access to decrypted data and by having detection technologies vetted. But the 270 experts suggest this amounts to tinkering around the edges of a security and privacy disaster.
“From a technical point of view, to be effective, this new proposal will completely undermine the security of communications and systems,” they warn. Relying on “flawed detection technology” to identify cases of interest, so that more targeted detection orders can be sent, will not reduce the risk of the law ushering in a dystopian era of “mass surveillance” of internet users’ messages, in their analysis.
The letter also addresses a Council proposal to limit the risk of false positives by defining a “person of interest” as a user who has already shared CSAM or attempted to groom a child, something that is envisaged to be done via an automated assessment, such as waiting for one hit for known CSAM, or two for unknown CSAM/grooming, before the user is officially identified as a suspect and reported to the EU Centre, which would handle CSAM reports.
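To make the proposed mechanism concrete, the sketch below shows what such an automated threshold check might look like. The thresholds, category names and function are illustrative assumptions only; the draft regulation does not specify an implementation.

```python
# Hypothetical sketch of the threshold-based "person of interest" flagging
# described in the Council proposal. Names and thresholds are illustrative
# assumptions; the draft text does not define how this would be implemented.

THRESHOLDS = {
    "known_csam": 1,    # a single detector hit would flag the user
    "unknown_csam": 2,  # two hits required before flagging
    "grooming": 2,
}

def record_hit(hit_counts: dict, category: str) -> bool:
    """Record a detector hit and return True if the user would now be
    treated as a 'person of interest' and reported to the EU Centre."""
    hit_counts[category] = hit_counts.get(category, 0) + 1
    return hit_counts[category] >= THRESHOLDS[category]

# Example: a first hit for unknown CSAM would not flag the user...
counts = {}
print(record_hit(counts, "unknown_csam"))  # False
# ...but a second hit would.
print(record_hit(counts, "unknown_csam"))  # True
```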
Billions of users, millions of false positives
Experts warn that this approach is still likely to result in huge numbers of false alarms.
“The number of false positives due to detection errors is very unlikely to decrease significantly unless the number of repetitions is so large that detection ceases to be effective. Given the large number of messages sent on these platforms (in the billions), one can expect a very large number of false alarms (in the millions),” they write, noting that the platforms likely to end up being slapped with a detection order can have millions or even billions of users, such as Meta-owned WhatsApp.
“Since there has been no public information on the performance of the detectors that could be used in practice, let us imagine we would have a detector for CSAM and grooming, as stated in the proposal, with a false positive rate of just 0.1% (i.e., one in a thousand times, it incorrectly classifies non-CSAM as CSAM), which is much lower than any currently known detector.
“Given that WhatsApp users send 140 billion messages a day, even if only 1 in a hundred was a message checked by such detectors, there would be 1.4 million false positives every single day. To reduce the false positives to the hundreds, statistically one would need to identify at least 5 repetitions using different, statistically independent images or detectors. And that is just for WhatsApp; if we consider other messaging platforms, including email, the number of repetitions required will increase significantly to the point where CSAM sharing capabilities are not effectively reduced.”
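As a rough check of the arithmetic cited in the letter, the figures quoted above work out as follows (the 0.1% rate and the 1-in-100 sampling are the letter’s hypotheticals, not measured values):

```python
# Back-of-the-envelope check of the false-positive arithmetic cited in the
# letter. All figures are the letter's stated hypotheticals, not measurements.

messages_per_day = 140_000_000_000  # WhatsApp messages sent per day
fraction_scanned = 1 / 100          # suppose only 1 in 100 messages is checked
false_positive_rate = 0.001         # hypothetical 0.1% false positive rate

scanned_messages = messages_per_day * fraction_scanned            # 1.4 billion
false_positives_per_day = scanned_messages * false_positive_rate
print(f"{false_positives_per_day:,.0f} false positives per day")  # 1,400,000
```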
Another Council proposal, to limit detection orders to messaging apps deemed “high risk”, is a useless revision in the view of the signatories, as they argue it will likely still “indiscriminately affect a huge number of people”. Here they point out that only standard features such as image sharing and text chat are needed to exchange CSAM, features that are widely supported by many service providers, meaning that a high-risk categorization “will undoubtedly affect many services.”
They also point out that adoption of E2EE is increasing, which they suggest will increase the likelihood that services deploying it will be classified as high risk. “This number may increase further with the interoperability requirements introduced by the Digital Markets Act which will result in messages flowing between low and high risk services. As a result, almost all services could be classified as high risk,” they argue. (Note: Message interoperability is a key element of the EU DMA.)
A backdoor for the backdoor
When it comes to safeguarding encryption, the letter reiterates the message that security and privacy experts have been repeatedly shouting at lawmakers for years: “Detection in end-to-end encrypted services undermines the protection of encryption by definition.”
“The new proposal has as one of its goals to ‘protect cyber security and encrypted data, while keeping services that use end-to-end encryption within the scope of detection orders’. As we have explained before, this is an oxymoron,” they emphasize. “The protection provided by end-to-end encryption implies that no one other than the intended recipient of a communication should be able to learn any information about the content of that communication. Enabling detection capabilities, whether for encrypted data or for data before it is encrypted, violates the very definition of confidentiality provided by end-to-end encryption.”
In recent weeks, police chiefs across Europe have put out a joint statement of their own, raising concerns about the expansion of E2EE and calling on platforms to design their security systems in such a way that they can still identify illegal activity and send reports on message content to law enforcement authorities.
The intervention is widely seen as an attempt to pressure lawmakers to pass laws like the CSAM scanning regulation.
Police chiefs deny they are calling for encryption to be backdoored, but they have not explained exactly which technical solutions they want platforms to adopt to enable the sought-after “legitimate access”. Squaring that circle puts a very oddly shaped ball back in the legislators’ court.
If the EU continues down its current path, that is, assuming the Council does not adopt the changed approach MEPs have urged, the consequences will be “disastrous”, the letter’s signatories go on to warn. “It sets a precedent for internet filtering and prevents people from using some of the few tools available to protect their right to privacy in the digital space. It will have a chilling effect, especially on teenagers who rely heavily on online services for their interactions. It will change the way digital services are used around the world and is likely to negatively impact democracies around the world.”
An EU source close to the Council was unable to provide insight into current discussions between member states, but noted that there is a working party meeting on May 8 at which, they confirmed, the proposal for a regulation to combat child sexual abuse will be discussed.