
EU approves emergency measures for children’s protection

Temporary emergency measures for children’s protection have just been adopted by the European Parliament.


Temporary emergency measures for children’s protection were adopted by the European Parliament on July 6th. This regulation will allow electronic communication service providers to scan private online messages for any display of child sexual abuse. The European Commission reported that almost 4 million visual media files containing child abuse were reported last year, along with 1,500 reports of grooming of minors by sexual predators. Over the past 15 years, reports of this kind have increased by 15,000%. 


This new regulation, which is intended to be implemented using AI, has raised some questions regarding privacy. 


Electronic communication service providers are being given the green light to voluntarily scan private conversations and flag content which may contain any display of child sexual abuse. This scanning procedure will detect content for flagging using AI, under human supervision. Providers will also be able to utilize anti-grooming technologies once consultations with data protection authorities are complete. These mechanisms have received some pushback due to privacy concerns. Last year, the EDPB published a non-binding opinion questioning whether these measures could threaten the fundamental right to privacy. 


Critics argue that this law will not prevent child abuse but will rather make it more difficult to detect and potentially expose legitimate communication between adults. 


This controversial legislation, drafted in September 2020 at the peak of the global pandemic, which saw a spike in reports of minors being targeted by predators online, enables companies to voluntarily monitor material related to child sexual abuse. However, it does not require companies to take action. Still, several privacy concerns were raised regarding its implementation, particularly around exposing legitimate conversations between adults which may contain nude material, violating their privacy and potentially opening them up to some form of abuse. During the negotiations, changes were made to require that users be informed of the possibility that their communications may be scanned, as well as to set data retention periods and limitations on the use of this technology. Despite this, the initiative was criticized on the grounds that automated tools flag irrelevant material in the majority of cases. Concerns were also raised about the possible effect on channels for confidential counseling. Ultimately, critics believe that this will not prevent child abuse, but will rather make it harder to discover, as it would encourage more hidden tactics. 


This new EU law for children’s protection is a temporary solution for dealing with the ongoing problem of child sexual abuse. 


From the start of 2021, the definition of electronic communications under EU law has been changed to include messaging services. As a result, private messaging, which was previously regulated by the GDPR, is now regulated by the ePrivacy Directive. Unlike the GDPR, the ePrivacy Directive did not include measures to detect child sexual abuse, and voluntary reporting by online providers fell dramatically following this change. Negotiations on revising the ePrivacy Directive to include protection against child sexual abuse have stalled for several years. This new EU law for children’s protection is but a temporary measure, intended to last until December 2025, or until the revised ePrivacy Directive enters into force. 


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.


Adequacy decisions adopted for EU-UK data transfers

Adequacy decisions have been adopted by the European Union for data transfers to the UK.


The European Commission has recently adopted adequacy decisions for the United Kingdom. Since Brexit there has been some question as to the UK’s adequacy, or rather the level of protection afforded to data transfers between the EU and the UK. With the adoption of these adequacy decisions, one under the General Data Protection Regulation (GDPR) and the other under the Law Enforcement Directive, data can now flow freely between the European Union and the United Kingdom. Data transferred to the UK will be considered as having a level of protection equivalent to that guaranteed under EU law.


The adequacy decisions came after a thorough assessment process, during which data transfers continued on the basis of the Trade and Cooperation Agreement. 


Since the draft adequacy decisions for the UK were published in February, the UK’s practices and laws regarding personal data protection have been carefully assessed. In April, the EDPB gave its opinion on UK adequacy, which was then followed by a comitology procedure including a vote from EU Member States. In the absence of an adequacy decision, and while one was being established, data transfers between the EU and the UK continued on the basis of the Trade and Cooperation Agreement. This agreement, which expired on June 30, 2021, provided that, in the absence of an adequacy decision, all data transfers carried out in the context of its implementation would comply with the GDPR and Law Enforcement Directive. 


UK data protection laws still very much resemble the laws under which the country operated as an EU Member State.


The UK’s data protection system is still based on the very same rules under which the country operated as an EU Member State. The principles, rights and obligations of the GDPR and Law Enforcement Directive have been fully incorporated into UK law. This made not only the Trade and Cooperation Agreement, but also the adequacy decisions, easier and more feasible. The UK also provides strong safeguards regarding access to personal data by public authorities: in principle, the collection of data by intelligence authorities is subject to prior authorization by an independent judicial body. 


The adequacy decisions include a sunset clause which causes them to expire after four years.


These adequacy decisions include a ‘sunset clause’, the first of its kind, which strictly limits their duration: the decisions will automatically expire after four years. The adequacy findings may then be renewed, but only if the UK continues to ensure an adequate level of data protection. In the meantime, the European Commission will continue to monitor the legal situation in the UK and reserves the right to intervene at any point if the UK deviates from the level of protection currently provided. If the Commission decides to renew the adequacy decisions after the four years, the adoption process would start over.


GDPR adequacy related to immigration control has been excluded from this decision, to be reassessed following a recent judgment of the England and Wales Court of Appeal.


Due to a recent judgment of the England and Wales Court of Appeal, data transfers for the purposes of UK immigration control have been excluded from the scope of the GDPR adequacy decision. The judgment affects the validity and interpretation of certain data protection rights related to immigration control, and the Commission will therefore reassess the necessity of this exclusion once the matter has been dealt with under UK law. 


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.


Call for a ban on facial recognition: EDPB and EDPS release a joint statement

The EDPB and EDPS have jointly called for a ban on the use of facial recognition for automated identification of individuals in public spaces. 


The EDPB and EDPS call for a ban on the use of AI for biometric identification in publicly accessible spaces. This includes facial recognition, fingerprints, DNA, voice recognition and other biometric or behavioral signals. The call comes after the European Commission outlined harmonized rules for artificial intelligence earlier this year. While the EDPB and EDPS embrace the introduction of rules addressing the use of AI systems in the EU by institutions, bodies and agencies, the organizations have expressed concern that international law enforcement cooperation is excluded from the proposal’s scope. The EDPB and EDPS also stress that it is necessary to clarify that the existing data protection regulation within the EU applies to any and all personal data processing falling under the scope of the draft AI regulation. 


The EDPB and EDPS call for a general ban on the use of AI in public spaces, particularly in ways which might lead to discrimination. 


In a recently released joint statement, the EDPB and EDPS recognize that remote biometric identification of individuals in public spaces poses extremely high risks, particularly the use of AI systems that rely on biometrics to categorize individuals based on ethnicity, gender, political or sexual orientation, or other grounds on which discrimination is prohibited. According to Article 21 of the Charter of Fundamental Rights, “Any discrimination based on any ground such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation shall be prohibited.”

In addition, the organizations are calling for a prohibition on the use of AI to deduce the emotional state of natural persons except in specific cases, for example in the field of health, where patient emotion recognition may be relevant and important. However, the EDPB and EDPS maintain that any use of this sort of AI for any type of social classification or scoring should be strictly prohibited. “One should keep in mind that ubiquitous facial recognition in public spaces makes it difficult to inform the data subject about what is happening, which also makes it all but impossible to object to processing, including profiling,” comments Dr Bostjan Makarovic, Aphaia’s Managing Partner.


The EDPB and EDPS call for greater clarity on the role of the EDPS as competent and market surveillance authority. 


In their joint opinion, the organizations welcome the fact that the European Commission proposal designates the EDPS as the market surveillance authority and competent authority for the supervision of institutions, agencies and bodies within the European Union. However, the organizations are also calling for further clarification of the specific tasks of the EDPS within that role. The EDPB and EDPS acknowledge that data protection authorities are already enforcing the GDPR and LED in the context of AI involving personal data. Nevertheless, the organizations suggest a more harmonized regulatory approach, involving the DPAs as designated national supervisory authorities, as well as consistent interpretation of data processing provisions across the EU. In addition, the statement calls for greater autonomy for the European Artificial Intelligence Board, in order to avoid conflict and create the conditions for a European AI body free from political influence. 


Do you want to learn more about facial recognition in public spaces? Check our vlog.

Do you use AI in your organisation and need help ensuring compliance with AI regulations? We can help you. Aphaia provides EU AI Ethics Assessments, Data Protection Officer outsourcing and ePrivacy, GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments. We can help your company get on track towards full compliance.


New SCCs adopted for international data transfers

New SCCs adopted by the European Commission last week introduce more legal and privacy safeguards for data transfers. 


Since the CJEU‘s Schrems II decision last July, which affected transfers outside the EU via Standard Contractual Clauses, SCCs have been the topic of much discussion regarding data transfers. These SCCs have been used by numerous companies for the transfer of data for several purposes including, but not limited to, cloud storage, hosting, finance and marketing. The announcement was made last Wednesday that the European Commission would adopt new Standard Contractual Clauses on Friday, June 4th. Justice Commissioner Didier Reynders said that these new SCCs “incorporated some elements of transparency, accountability in full compliance with the GDPR”, adding that the goal was to avoid a “Schrems III”.


The European Commission has adopted two sets of Standard Contractual Clauses reflecting the new requirements under the GDPR. 


The new SCCs adopted by the European Commission for the transfer of personal data to third countries take into account the details of the Schrems II judgment by the CJEU, and offer more legal predictability to European businesses. The new SCCs are expected to help small to medium enterprises in particular, to ensure compliance with safe data transfer requirements. They will provide companies with a template which is easy to implement, allowing data to move freely across borders, without legal barriers. 


The European Commission has also adopted another set of SCCs for use between controllers and processors within the EU.


The new SCCs are more practical and flexible and cover a broad range of transfer scenarios.


The new Standard Contractual Clauses include an overview of the different steps that companies will have to take in order to comply with the Schrems II judgment, complete with examples of possible supplementary measures which may be necessary to ensure compliance. These supplementary measures are intended to strengthen the protection of data transferred to third countries which are not regarded as providing adequate protection. These additional safeguards include encryption and pseudonymization of personal data, which prevents the data from being attributed to a specific individual without the use of additional details. The new SCCs adopted by the European Commission cover a broad range of transfer scenarios, all in one practical toolbox. 


A transition period of 18 months is provided for processors and controllers currently using the old SCCs.

Many companies have, since the CJEU’s judgment last summer, been using Standard Contractual Clauses to facilitate their third-country personal data transfers. When the EU-US Privacy Shield was invalidated last July, the court confirmed the validity of the EU Standard Contractual Clauses for the transfer of personal data to processors outside the EU. However, this did not come without complications, as in various cases it was found that the SCCs did not provide sufficient protection for personal data transferred to the US and other third countries. These now-outdated SCCs are currently in use by the majority of companies that transfer data to third countries. The European Commission has confirmed that they can continue to be used for the next 18 months, as companies transition to the new SCCs adopted last Friday. 


Do you make international data transfers to third countries? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, transfer impact assessments and Data Protection Officer outsourcing.  Contact us today.