The EU’s proposed AI mechanism to combat child pornography, and experts’ concerns about mass surveillance risks

May 6, 2024

The European Union wants to detect and combat child pornography content more proactively by using artificial intelligence systems, prompting a group of cybersecurity experts to warn of mass surveillance risks.

The Child Sexual Abuse Regulation (CSAR) was implemented by the EU in 2021. | © Zeke See

Nearly 200 IT security scientists from 26 countries are concerned about the extension of a European derogation designed to combat the spread of child pornography by requiring e‑mail services to install detection software.

The Child Sexual Abuse Regulation (CSAR), extended last Tuesday, introduces a complex technical architecture called “client-side scanning” to combat the proliferation of child pornography on the Internet.

The EU derogation, first adopted in 2021 and now extended until April 3, 2026, provides an interim legal framework for the detection of online child sexual abuse until a new law, currently on the table of the Council and the European Parliament, is passed.

This approach, which has still not been implemented in practice, relies on artificial intelligence systems to detect images, videos, and speech containing sexual abuse of minors, as well as attempts to manipulate children.

96% of content removed by YouTube is flagged by automated detection technologies

All digital platforms liable to be used for malicious purposes — from Facebook to Telegram, Snapchat, and TikTok, as well as online gaming sites — would have to use this technology to detect and report traces of child pornography material on their systems and in users’ private chats.

The EU justifies this measure by explaining that proactive detection of child sexual abuse is essential to prevent its spread, as public reporting will never be sufficient. In fact, almost 96% of content removed by YouTube is reported by automated detection technologies, and in most cases, this happens before the video reaches 10 views.

“Preventive measures, such as digital literacy and risk assessment and mitigation, are essential to creating a safe digital environment for children, but will not be enough to stop the proliferation of online child sexual abuse. The Child Sexual Abuse Regulation (CSAR) and detection will enable law enforcement to save children and arrest offenders every day across Europe,” reports Eurochild, an EU-funded child protection working group.

Although they consider the sexual abuse and exploitation of children to be serious crimes, and agree that governments, service providers, and society as a whole must take a major responsibility in the fight against them, the scientists state in an open letter to the European Commission that, technically, “this new proposal completely compromises the security of communications and systems.”

This derogation “creates unprecedented capabilities for monitoring and controlling Internet users. This compromises the security of our society’s digital future and could have enormous consequences for democratic processes in Europe and beyond,” they say.

Many false alarms

Detection software can be a good method of tracking down users of previously identified child pornography, says Jaap-Henk Hoepman, a computer scientist at Karlstad University and co-author of the letter, interviewed by the Dutch media outlet NOS. “But the detection software needs to be installed in a way that recognizes new, unknown content, and this will require the use of artificial intelligence, a technology that is not yet sufficiently developed,” he points out. “We’ll get a lot of false alarms.”
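The distinction Hoepman draws — reliable matching against previously identified material versus error-prone AI detection of new content — can be sketched in code. Known-material detection typically compares perceptual hashes against a database of identified images; the hash values, threshold, and simplified 64-bit format below are illustrative assumptions, not the mechanism specified in the EU proposal.

```python
# Minimal sketch of matching against previously identified material,
# assuming hypothetical 64-bit perceptual hashes. Real systems (e.g.,
# PhotoDNA-style hashing) differ in detail; this only shows the idea.

def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(h1 ^ h2).count("1")

def matches_known(candidate: int, known_hashes: set, threshold: int = 8) -> bool:
    """Flag a candidate whose hash is within `threshold` bits of any known hash.

    Near-duplicates (re-encoded, resized copies) land close to the original
    hash; unrelated images land far away, so no AI is needed for this step.
    """
    return any(hamming_distance(candidate, k) <= threshold for k in known_hashes)

# Hypothetical database of hashes of previously identified images
known = {0x1234_5678_9ABC_DEF0}

# A near-duplicate (2 bits flipped) matches; an unrelated hash does not.
print(matches_known(0x1234_5678_9ABC_DEF3, known))  # True
print(matches_known(0xFFFF_0000_FFFF_0000, known))  # False
```

Detecting *new* content, by contrast, requires a trained classifier rather than a lookup — which is exactly where Hoepman expects the false alarms to come from.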

Ben van Mierlo, the national child pornography coordinator for the Dutch police, also interviewed by NOS, shares the same concerns: “A grandfather who sends a photo of his grandson in the swimming pool to the family group will be detected by the software, which will send a report to Interpol or Europol. And the sender or recipient will be considered suspicious when in fact they are not.” This will also require a lot of manpower. “Billions of messages will have to be examined,” he adds.
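Van Mierlo’s worry is a base-rate problem: when billions of overwhelmingly innocent messages are scanned, even a very accurate classifier produces far more false alarms than true hits. The numbers below are entirely hypothetical, chosen only to illustrate the arithmetic — they do not come from the article or the EU proposal.

```python
# Back-of-the-envelope illustration of the false-alarm problem.
# All figures are assumptions for the sake of the calculation.

messages_per_day = 10_000_000_000   # assumed volume of scanned messages
prevalence = 1e-6                   # assumed fraction that is truly illegal
false_positive_rate = 0.001         # assumed 0.1% false-positive rate
true_positive_rate = 0.95           # assumed 95% detection rate

true_hits = messages_per_day * prevalence * true_positive_rate
false_alarms = messages_per_day * (1 - prevalence) * false_positive_rate

print(f"true hits per day:    {true_hits:,.0f}")
print(f"false alarms per day: {false_alarms:,.0f}")
# Under these assumptions, false alarms outnumber true hits by roughly
# a thousand to one, so the daily stream of reports would consist almost
# entirely of innocent users' messages — like the grandfather's photo.
```

This is why the scale of manual review van Mierlo describes follows directly from the scanning volume, independently of how good the classifier is in percentage terms.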

Van Mierlo points out that the detection software will only work if the police also have access to the messages. “If the recipient of suspicious content simply receives a notification that this message cannot be seen, the spread will be stopped. But we need evidence to be able to arrest a suspect,” he said.

“Infringement of fundamental privacy protections”

He also called on the companies behind the messaging services to take responsibility. “Companies like Meta need to take responsibility for alerting investigating authorities and passing on personal data in the context of criminal activity. They say it’s not technically possible, but we doubt it,” he explains.

The police have been trying to access encrypted messages for almost as long as they’ve existed, something privacy experts, including Hoepman, vehemently protest: “Privacy is a great asset. The police can’t see encrypted correspondence, but they have a lot of metadata: who communicates with whom and when, and they have the location data of ten million Dutch people. They can analyze all this.”

Last February, Apple warned against a similar Australian proposal to force tech companies to analyze cloud and messaging services for child pornography. According to the multinational, this process risks “undermining fundamental privacy and security protections and may lead to mass surveillance with global repercussions.”

Julie Carballo

Julie Carballo is a journalist for Newsendip.

She used to work for the French newspaper Le Figaro and at the Italian bureau of the international press agency AFP.