A measure that would change the Web and our democracies forever: on Monday, October 23, the European Data Protection Supervisor (EDPS) hosted a seminar entitled “A point of no return?”. For more than two hours, experts took turns commenting on a controversial measure in the proposed regulation aimed at ending the sexual abuse of children, known as the “CSAR” (Child Sexual Abuse Regulation). If adopted, the regulation would force platforms and messaging services, including encrypted ones, to detect and report child sexual abuse material (“CSAM”) to the authorities.
This new obligation, sometimes referred to as “chat control”, has sparked controversy from the start. According to many experts, it would weaken, or even put an end to, the encryption offered by messaging services such as WhatsApp, Signal or Telegram. On these platforms, only the sender and the recipient, each holding a key, have access to the content exchanged. Neither the messaging service itself nor the police can read it, even when it involves organized crime, terrorism or child sexual abuse. By chipping away at encryption and forcing companies to implement automatic detection, the regulation could lead to widespread surveillance of all our communications, privacy advocates have warned.
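To illustrate the principle, here is a minimal sketch of end-to-end encryption using the PyNaCl library (a Python wrapper around libsodium). It is illustrative only: real messengers such as Signal add far more machinery, like key ratcheting, on top of this basic idea.

```python
# A minimal sketch of end-to-end encryption with PyNaCl (libsodium).
# Illustrative only: real messengers use more elaborate protocols
# (e.g. Signal's double ratchet), but the core property is the same.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# Only Bob, holding his private key, can decrypt.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at noon"

# The server only ever relays `ciphertext`: without a private key,
# neither the platform nor the police can recover the message.
```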
“Ineffective and harmful” measures
The EDPS, the independent authority responsible for monitoring how European institutions, such as the European Commission, protect data, therefore set out to bring together experts from different fields and compare their points of view. A year earlier, the EDPS and the European data protection authorities had already explained, in a joint opinion, that the draft regulation “raised serious data protection and privacy concerns”. At the close of the October 23 seminar, the Supervisor went further, concluding that there is a “very broad and almost unprecedented consensus between (…) data protection agencies, legal experts, academia, industry, civil society, national legislators and law enforcement authorities”.
Forcing the scanning of all messages would be “not only ineffective, but also harmful”. Such a measure would “call into question the foundations [of privacy] which, once shaken, would lead to a radical change from which we could not return”, warned Wojciech Wiewiórowski, the European Data Protection Supervisor, during the seminar.
“There is a broad and almost unprecedented consensus that today’s seminar has shown – a consensus between different groups of stakeholders – that the CSAM proposal is not only ineffective, but also harmful” @W_Wiewiorowski closing the #EDPS #Seminar on the #CSAM Proposal pic.twitter.com/BE5KbSvyt0
— EDPS (@EU_EDPS) October 23, 2023
A point of no return? A point of no return.
In its current form, the European Supervisor writes in an information note published on October 23, “the CSAM proposal would fundamentally change the Internet and digital communication as we know it. This would be a point of no return”. The independent authority hopes that the voices of the experts who spoke on Monday will eventually reach the ears of the European Commission. Ylva Johansson, the European Commissioner behind the regulation, was invited to speak at the seminar, which took place within the European Parliament. But the commissioner, dogged by various controversies, did not respond to the invitation, the Supervisor said.
Also read: Microtargeted advertising: to defend its draft regulation on the protection of minors (CSAR), has the European Commission gone too far?
During the seminar, three points crystallized the experts’ concerns and criticisms: ineffectiveness, technical impossibility, and the risk of large-scale surveillance, with its alarming consequences for our fundamental rights.
Easily circumvented detection measures
The specialists first explained that imposing a blanket scan to identify child sexual abuse content would not offer children any more protection, for a simple reason: the detection measures that would be put in place could easily be circumvented, noted Bart Preneel, professor and cybersecurity expert at KU Leuven in Belgium.
“Once you know what the AI is looking for, it is possible to bypass it, and you can also use AI to bypass the AI, so there will be a competition between AI tools.” Who would win? Probably the side trying to bypass the automatic scan, Bart Preneel replied.
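The same dynamic applies to the hash-matching systems (such as Microsoft’s PhotoDNA) usually proposed for detecting known images; the toy below, with made-up pixel values and a deliberately simplistic “average hash”, only illustrates the principle: a perceptually invisible change can flip a bit of the fingerprint and defeat the match.

```python
# Toy perceptual "average hash": one bit per pixel, set when the pixel
# is brighter than the image mean. Real detectors (e.g. PhotoDNA) are
# far more elaborate, but face the same adversarial dynamic: tiny,
# invisible edits to an image can change the fingerprint.

def average_hash(pixels):
    mean = sum(pixels) / len(pixels)
    return tuple(p > mean for p in pixels)

# A known image whose fingerprint sits in a detection database.
flagged = [200, 180, 90, 135, 210, 40, 170, 80]
database = {average_hash(flagged)}

# The adversary brightens one pixel that sits near the mean: a change
# invisible to a human viewer, but enough to flip a hash bit.
evading = list(flagged)
evading[3] += 7

print(average_hash(flagged) in database)  # True  -> detected
print(average_hash(evading) in database)  # False -> evades the match
```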
False positives galore
The second problem raised: this type of detection system would not be reliable, because it would generate false positives. “This technology is not error-free, but we are talking about billions of communications every day,” stressed Alexander Hanff, a computer scientist and privacy specialist. “Even if we have an error rate of 0.1%, that’s millions of false positives or false negatives every day, which is not something we can accept in a democracy.”
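The arithmetic behind that claim is easy to check; the daily message volume below is an assumed round figure, used only for illustration:

```python
# Back-of-the-envelope arithmetic behind Hanff's point. The daily
# volume is an assumption for illustration; the error rate is the
# 0.1% figure quoted above.
messages_per_day = 10_000_000_000  # assumed: 10 billion scanned messages
error_rate = 0.001                 # 0.1%, as quoted in the article

erroneous_flags = messages_per_day * error_rate
print(f"{erroneous_flags:,.0f} misclassified messages per day")
# -> 10,000,000 misclassified messages per day: millions of false
#    positives or false negatives, each a potential police report.
```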
These false positives risk flooding investigation services that are already overstretched. A further difficulty: detection would not be limited to child sexual abuse content (such as photos or images). It is also supposed to apply to “grooming”, the practice whereby an adult seeks to meet a minor online for sexual purposes or with a view to producing child sexual abuse material. But how would this technology distinguish between minors exchanging sexts and a minor in danger facing a child molester?
“We would need AI to differentiate between legitimate and illegitimate content, which would be extremely difficult to do,” acknowledged Professor Bart Preneel. “As a simple example, we would eventually need a reliable way to verify age, and I think that’s a very difficult thing to do unless we mandate that everyone log in with a state-provided identity. Which of course raises many other questions,” he added.
In the risk-benefit balance, the sums simply would not add up, the experts explained: on one side, citizens would be monitored far more heavily; on the other, children would be no safer.
Technical impossibility
Another challenge posed by the proposed regulation: technically, it is impossible to analyze the contents of a conversation without weakening end-to-end encryption and without infringing on users’ privacy. “This is the unequivocal conclusion of hundreds of leading scientists and researchers in this field. And many experts agree that current cutting-edge technology solutions for detection are not reliable enough and are also vulnerable to cyberattacks,” notes the Supervisor.
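The structural tension is visible even in a toy model of “client-side scanning”, the approach most often proposed for detecting content in end-to-end encrypted apps. Everything below is an illustrative stand-in, not any real vendor’s API:

```python
# A runnable sketch of "client-side scanning". All names here are
# illustrative. The point: the scan has to run on the plaintext
# *before* encryption, so whoever controls the detection model can
# effectively read every message, which is exactly the guarantee
# end-to-end encryption was supposed to provide.

BLOCKLIST_HASHES = {hash(b"known-illegal-bytes")}  # stand-in for a CSAM hash DB

def matches_detection_model(plaintext: bytes) -> bool:
    return hash(plaintext) in BLOCKLIST_HASHES

def encrypt(plaintext: bytes) -> bytes:
    return plaintext[::-1]  # placeholder for real public-key encryption

def send_message(plaintext: bytes) -> bytes:
    if matches_detection_model(plaintext):
        print("reported: plaintext left the device before encryption")
    return encrypt(plaintext)  # only now does the content become opaque

send_message(b"known-illegal-bytes")   # flagged on-device, pre-encryption
send_message(b"private conversation")  # scanned anyway, silently
```

Whoever controls the detection model decides what gets flagged, which is why critics argue the approach amounts to a hole in the encryption itself.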
This was explained by Claudia Peersman, a researcher who was part of a group of experts tasked, in the United Kingdom, with evaluating systems that were supposed to scan encrypted messages without compromising people’s privacy. According to her, “none of the tools were able to satisfy [our criteria]. (…) We simply believe that these tools are not ready to be deployed on a large scale on private messages, within end-to-end encrypted environments”.
Finally, the widespread scanning of all our communications would create the feeling of being constantly under surveillance, which would affect our fundamental rights. Many young adults and adolescents might give up being politically active, campaigning, or exploring their sexuality if the authorities were able to monitor their digital communications.
“Privacy is not an option, but an essential gateway to all our other fundamental rights”
However, recalled Ella Jakubowska, a project manager at the NGO European Digital Rights (EDRi), “privacy is not an option, but an essential gateway to all our other fundamental rights”. It is this principle of respect for private life that “allows you to enjoy other rights (…) such as being able to express your own personality, identity and beliefs. It is from this principle that our ability to take part in political, economic, social and cultural life stems”. For Jakubowska, there is no need to set privacy against security, as the European Commission does in its proposal: it is not about giving up privacy to keep children safe, she added.
Implementing this type of technology would also mean collecting a vast amount of data. How can we ensure that detection remains limited to child sexual abuse content? That is a question no one knows how to answer, the specialists explained. And more broadly, are the European Union and its member states really prepared to accept mass surveillance of communications? Are they prepared to blur the line between democratic countries and authoritarian regimes, the Supervisor asked.
On the side of the European institutions, no answers to these questions have yet emerged since the end of the seminar. But the lines could move on October 25: this Wednesday, members of the European Parliament’s Civil Liberties Committee are due to put their questions to European Commissioner Ylva Johansson. And this time, the politician will not be able to decline the invitation.
Also read: When Gérald Darmanin takes on encryption again, after the Arras attack
Source: EDPS seminar of October 23, 2023