Temporary emergency measures for children’s protection have just been adopted by the European Parliament.
Temporary emergency measures for children’s protection were adopted by the European Parliament on July 6th. The regulation will allow electronic communication service providers to scan private online messages for material depicting child sexual abuse. The European Commission reported that almost 4 million visual media files containing child abuse were flagged last year, along with 1,500 reports of grooming of minors by sexual predators. Over the past 15 years, reports of this kind have increased by 15,000%.
This new regulation, which is intended to be implemented using AI, has raised questions regarding privacy.
Electronic communication service providers are being given the green light to voluntarily scan private conversations and flag content that may depict child sexual abuse. The scanning will use AI to detect content for flagging, under human supervision. Providers will also be able to deploy anti-grooming technologies once consultations with data protection authorities are complete. These mechanisms have received pushback over privacy concerns: last year, the European Data Protection Board (EDPB) published a non-binding opinion questioning whether the measures would threaten the fundamental right to privacy.
Critics argue that this law will not prevent child abuse but will instead make it harder to detect, while potentially exposing legitimate communication between adults.
This controversial legislation, drafted in September 2020 at the peak of the global pandemic, which saw a spike in reports of minors being targeted by predators online, enables companies to voluntarily monitor material related to child sexual abuse; it does not, however, require them to do so. Several privacy concerns were nonetheless raised about its implementation, particularly the risk of exposing legitimate conversations between adults that may contain nude material, violating their privacy and potentially opening them up to some form of abuse. During the negotiations, changes were made to require informing users that their communications may be scanned, and to set data retention periods and limits on how the technology may be applied. Despite this, the initiative was criticized on the grounds that automated tools flag irrelevant material in the majority of cases, and concerns were raised about the possible effect on channels for confidential counseling. Ultimately, critics believe the measure will not prevent child abuse but will make it harder to discover, as it would push abusers toward more hidden tactics.
This new EU law for children’s protection is a temporary solution for dealing with the ongoing problem of child sexual abuse.
Since the start of 2021, the definition of electronic communications under EU law has been expanded to include messaging services. As a result, private messaging, which was previously regulated by the GDPR, is now regulated by the ePrivacy Directive. Unlike the GDPR, the ePrivacy Directive did not include measures to detect child sexual abuse, and voluntary reporting by online providers fell dramatically after the change. Negotiations on revising the ePrivacy Directive to include protection against child sexual abuse have stalled for several years. The new EU law for children’s protection is therefore only a temporary measure, intended to last until December 2025 or until the revised ePrivacy Directive enters into force.