More trouble for European Union lawmakers in a controversial area of technology policymaking, namely the bloc’s proposed legislation to apply monitoring technologies such as client-side scanning to digital messages in a bid to detect child sexual abuse material (CSAM).
This week the European Ombudsman published details of a finding of maladministration, made in December, over a decision by the EU executive not to release fuller information about its communications with a child safety technology maker. Last year the Commission released some documents relating to its dealings with the company in question but refused access to others.
The recommendation follows a complaint made to the Ombudsman in June 2022 by a journalist who had requested public access to documents sent to the Commission by Thorn, a US entity that sells artificial intelligence technologies it claims can detect and remove CSAM.
In her recommendation, EU Ombudsman Emily O’Reilly urges the Commission to “reconsider its decision with a view to providing significantly increased, if not full, public access to the documents in question”.
“In light of the relevant ongoing legislative process and the consequent time sensitivity of this case, the Ombudsman urged the Commission to implement her recommendation expeditiously,” it adds.
The Commission presented its initial proposal for a legal framework that could oblige digital services to use automated technologies to detect and report existing or new CSAM, and to detect and report grooming activity targeting children on their platforms, in May 2022. However, the file remains under active negotiation by the EU’s co-legislators, the European Parliament and the Council, a factor the Ombudsman highlights as making transparency all the more important for promoting accountability around EU lawmaking.
Disclosure of the disputed documents “will allow the public to participate more effectively in a decision-making process that will very likely directly affect the daily lives of citizens by limiting their right to privacy,” she suggests. “Secondly, transparency will allow the public to check who and what informed the legislative proposal in question. Those who actively provide evidence should not be allowed to do so behind closed doors.”
Critics have suggested the Commission’s controversial message-scanning proposal has been unduly influenced by lobbyists pushing proprietary child safety technology, who stand to profit commercially from laws mandating automated CSAM checks.
Last autumn, a seminar organized by the European Data Protection Supervisor also heard a wide range of concerns that the Commission’s proposal is likely to be both ineffective as a tool to combat child sexual abuse and a significant risk to fundamental freedoms in a democratic society.
Lawmakers have since advocated a revised approach to tackling CSAM that would remove the requirement for messaging platforms to scan end-to-end encrypted messages, among other limits. But EU lawmaking is a three-way affair: it also requires buy-in from the Commission and the Council. So it remains to be seen where the CSAM file will land.
Asked on Monday about the Ombudsman’s recommendation that it publish more of its exchanges with Thorn, the EU executive took until today (Wednesday) to send us a short reply (see below). Its response suggests it plans to take its time digesting the Ombudsman’s finding of maladministration, as it makes a point of noting that the deadline for responding to her recommendation is more than two months away. Which doesn’t suggest a speedy resolution is incoming. It smells more like a can being kicked down the road.
Here is the statement, attributed to the European Commission’s spokesperson for Home Affairs, Anitta Hipper:
The Commission will provide access to documents as appropriate and within our legal framework. Specifically, with regard to the Ombudsman’s recommendation, the Commission will carefully consider the Ombudsman’s recommendation. The answer is expected by March 19.
The legislative proposal has already sparked another row for the Commission. Last year it landed in hot water over microtargeted ads its home affairs department was spotted running on the social network X to promote the legislation, leading to a series of data protection complaints as the data used for the targeting appeared to include sensitive personal information.
In November, the privacy rights group noyb filed a complaint about this with the Commission’s privacy watchdog, the European Data Protection Supervisor.
An internal investigation the Commission opened after the incident was reported has, meanwhile, yet to produce any public results. Each time we have asked the Commission about the investigation, we have been told there is no update.
However, the existence of the internal investigation has had one tangible effect: the EU Ombudsman declined to open an investigation into the microtargeting following a complaint by MEP Patrick Breyer last October. In her reply to the MEP, O’Reilly pointed to the Commission’s ongoing internal investigation as reason enough not to look into the matter at that point, writing: “I note that the Commission has explained to the media that internal investigations are ongoing. Therefore, at present, I do not find sufficient grounds to initiate an investigation.”
At the same time, she agreed to open an investigation into the moves of two employees from Europol, the pan-European law enforcement coordination agency, to Thorn, after another complaint by Breyer flagging a possible conflict of interest.
“I have decided to launch an investigation into how Europol dealt with the transfers of two former staff members to positions related to the fight against online child sexual abuse,” she wrote. “As a first step, I have decided that it is necessary to inspect certain documents held by Europol relating to these post-service activities. I expect to receive these documents by January 15, 2024.”
What the Ombudsman’s investigation into Europol’s links with Thorn will turn up remains to be seen. (But there is, perhaps, no small irony in the fact that additional controversy around the Commission’s message-scanning proposal is being fueled by access to, and/or the lack of access to, “private” communications passing between EU institutions and industry lobbyists. There’s a message there for policymakers, if only they’d read it.)
We reached out to Thorn, but it did not respond to a request for comment about the ombudsman’s investigation.
A piece of investigative journalism published by BalkanInsight last fall, which dug into Thorn’s lobbying and reported on communications between the Commission and Thorn that its reporters were able to obtain, questioned the level of influence over EU policymaking gained by commercial makers of child safety technology who stand to benefit from laws mandating the scanning of messages.
“After seven months of communication about access to documents and the intervention of the European Ombudsman, in early September the Commission finally published a series of email exchanges between Johansson’s Directorate-General for Migration and Home Affairs and Thorn,” its reporters wrote. “The emails reveal an ongoing and close working relationship between the two sides in the months after the launch of the CSAM proposal, with the Commission repeatedly facilitating Thorn’s access to critical decision-making spaces involving ministers and representatives of EU member states.”
The EU commissioner who spearheaded the proposal to scan for CSAM, home affairs commissioner Ylva Johansson, has repeatedly rejected allegations that she allowed industry lobbyists to influence her proposal.
A follow-up report from BalkanInsight last year, citing meeting minutes released under freedom of information rules, found that Europol officials had pressed Commission staff in a meeting for unfiltered access to the data that would be obtained under the CSAM scanning proposal, and for the scanning systems to be used to detect other types of crime, not just child sexual abuse.
Critics of the EU’s controversial CSAM scanning proposal have long warned that once monitoring technology is embedded in private messaging infrastructure, there will be pressure from law enforcement agencies to expand the scope of what gets scanned.
