Two of the EU's main institutions, the European Parliament and the European Council, are debating a draft regulation that could require technology companies to automatically scan all private communications exchanged in Europe (every message, photo, video, post and email) in search of content related to child pornography.
Although popularly known as ‘Chat Control’, its official name is the “Regulation laying down rules to prevent and combat child sexual abuse”. But, according to the cyberactivist organization XNET,
“Contrary to what its title indicates, there is not a single line in this legislation on how to prevent and combat sexual abuse of minors.”
Some of its seemingly reasonable measures (mandatory age verification for messaging applications, barring minors from installing many applications, etc.) necessarily entail prior identification of the user, making online anonymity de facto impossible.
The regulation, furthermore, leaves very few areas untouched by its preventive surveillance. The affected technologies are numerous: from text and multimedia sent via messaging apps to email, videoconferencing and chats (including in video games and on dating sites), and it applies equally to web hosting, social networks, cloud services, etc.
It would also affect all monetized services (whether subscription- or advertising-based), with no threshold for size, number of users, etc. And it would, of course, apply to providers from outside the EU as well.
Inspectable encryption, the concept
The regulation does not even exclude applications with end-to-end encryption. In XNET’s words, it “proposes an impossible technology (encrypted and, at the same time, inspectable)”. How is that supposed to work?
Easy: to preserve the appearance of respecting end-to-end encryption, the EU proposes that content analysis and detection be carried out on the users’ devices themselves.
That is, it is not only a mockery of the concept of privacy (equivalent to proclaiming respect for the inviolability of the mail as long as the envelope stays sealed… while requiring you to write and open your letters in front of an official)…
…but it is also, today, a technological impossibility: constantly running the machine learning algorithms this task requires is beyond many PCs and mobile devices currently on the market. Not, at least, without noticeably degrading their performance.
Useful technology for mass surveillance, not for hunting pedophiles
The regulation states that automatic searches for child pornography images will be carried out, covering both images already known to the authorities (using the hash comparison method) and unknown ones that could potentially fall into that category (which is where machine learning algorithms come in).
We have already explained on previous occasions that hash comparison is a useless mechanism in practice (a slight alteration to the source file is enough to change the hash), and it has also become abundantly clear that the precision of AI algorithms at this kind of task is still far (very far) from being functional.
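The fragility of hash comparison is easy to demonstrate: a cryptographic hash is designed so that any change to the input, however small, produces a completely different digest. A minimal sketch in Python (the byte string standing in for an image file is, of course, an illustrative placeholder):

```python
import hashlib

# Placeholder standing in for the bytes of an image file
original = b"example image bytes ..."

# Flip a single bit of the first byte. Re-encoding, cropping a pixel
# or appending metadata would have the same effect on the hash.
altered = bytes([original[0] ^ 0x01]) + original[1:]

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(altered).hexdigest()

print(h1)
print(h2)
print(h1 == h2)  # False: the hashes no longer match
```

A file altered this trivially would sail past any database of known hashes, which is why the article calls the mechanism useless in practice. (Real systems often use perceptual hashes instead, which tolerate small changes but introduce their own false-match problems.)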
According to the German representative of the Pirate Party, Patrick Breyer,
“The error rate [of the algorithms to be used] is unknown and is not limited by the draft regulation. Presumably, these technologies will result in massive amounts of false reports.”
In fact, the regulation does not state that the algorithms will ever be accessible to the public or the scientific community, nor does it contain any disclosure requirement. For practical purposes, we will have no guarantee that what they are looking for is child pornography and not some other type of content.
Bart Preneel, a cryptography expert at the Catholic University of Leuven, is the author of a technical report for the European Parliament that exposes the flaws of this system. He cites data provided by Thorn, the communications-scanning company whose founder, Ashton Kutcher, has previously met with Ylva Johansson, the EU Home Affairs Commissioner:
“Thorn says their system gives 10% false positives. Given that Europeans exchange billions of messages daily, that means that tens of millions of people will be charged every day. Even if they are innocent, their information will be processed and stored by law enforcement, and if it is leaked, their reputation will be impacted.”
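The scale of the problem is simple arithmetic. A back-of-envelope sketch (the daily message volume is our assumption; Preneel only says “billions”):

```python
# Back-of-envelope estimate using the figures quoted above.
daily_messages = 10_000_000_000   # assumed: ~10 billion messages/day in Europe
false_positive_rate = 0.10        # the 10% figure attributed to Thorn

false_flags_per_day = daily_messages * false_positive_rate
print(f"{false_flags_per_day:,.0f} messages wrongly flagged per day")
```

Even if every flagged person accounted for dozens of flagged messages, the number of innocent people reported daily would still plausibly run into the tens of millions, as Preneel argues.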
Support and criticism
Johansson has defended the proposal by invoking the importance of combating online child abuse. On the opposite side, not only have companies such as Apple expressed their (somewhat hypocritical) concern about the threat this proposal poses to user privacy… but this view has also been backed by experts and non-profit organizations.
Andy Yen, CEO of Proton, argues that the regulation, beyond its good intentions, could end up compromising the fundamental right to privacy. Indeed, an EU legal report supports these fears, suggesting that the regulation could result in “de facto surveillance” of communications, something theoretically illegal in the EU.
Matthias Pfau, CEO of Tutanota (like Proton, a secure email service), pointed out a few days ago that the rule “is illegal under EU law”, since, at the beginning of this year, the European Parliament’s legal experts concluded in a study on the legality of the regulation that
“when weighing the fundamental rights affected by the proposal’s measures, it can be established that it would violate Articles 7 and 8 of the Charter of Fundamental Rights with respect to users.”
For all these reasons, the draft, which its proponents expected to be approved quickly, has been bogged down for two years by the rejection it provokes among several governments (though no, the Spanish government is not exactly among them; quite the opposite).
And bear in mind that, being a ‘regulation’ (and not a ‘directive’), once approved it would apply directly in all member countries of the European Union, without waiting for national transpositions that could qualify the base text.
Image | Marcos Merino through AI
In Xataka Mobile | Europe yields to the United States again with a new agreement on the transfer of user data