
EU approves emergency measures for children’s protection

Temporary emergency measures for children’s protection have just been adopted by the European Parliament.


Temporary emergency measures for children’s protection were adopted by the European Parliament on July 6th. This regulation will allow electronic communication service providers to scan private online messages containing any display of child sex abuse. The European Commission reported that almost 4 million visual media files containing child abuse were reported last year. There were also 1,500 reports of grooming of minors by sexual predators. Over the past 15 years, reports of this kind have increased by 15,000%. 


This new regulation, which is intended to be implemented using AI, has raised some questions regarding privacy. 


Electronic communication service providers are being given the green light to voluntarily scan private conversations and flag content which may contain any display of child sex abuse. This scanning procedure will detect content for flagging using AI, under human supervision. They will also be able to utilize anti-grooming technologies once consultations with data protection authorities are complete. These mechanisms have received some pushback due to privacy concerns. Last year, the EDPB published a non-binding opinion which questioned whether these measures would threaten the fundamental right to privacy. 


Critics argue that this law will not prevent child abuse but will rather make it more difficult to detect and potentially expose legitimate communication between adults. 


This controversial legislation, drafted in September 2020 at the peak of the global pandemic, which saw a spike in reports of minors being targeted by predators online, enables companies to voluntarily monitor material related to child sexual abuse. However, it does not require companies to take action. Still, several privacy concerns were raised regarding its implementation, particularly around exposing legitimate conversations between adults which may contain nude material, violating their privacy and potentially opening them up to some form of abuse. During the negotiations, changes were made to include the need to inform users of the possibility of scanning their communications, as well as dictating data retention periods and limitations on the use of this technology. Despite this, the initiative was criticized on the grounds that automated tools flag irrelevant material in the majority of cases. Concerns were also raised about the possible effect this may have on channels for confidential counseling. Ultimately, critics believe that this will not prevent child abuse, but will rather make it harder to discover, as it would encourage more hidden tactics. 


This new EU law for children’s protection is a temporary solution for dealing with the ongoing problem of child sexual abuse. 


From the start of 2021, the definition of electronic communications was changed under EU law to include messaging services. As a result, private messaging, which was previously regulated by the GDPR, is now regulated by the ePrivacy directive. Unlike the GDPR, the ePrivacy directive did not include measures to detect child sexual abuse, and voluntary reporting by online providers fell dramatically after the change. Negotiations on revising the ePrivacy directive to include protection against child sexual abuse have stalled for several years. This new EU law for children’s protection is but a temporary measure, intended to last until December 2025 or until the revised ePrivacy directive enters into force. 


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Colorado Privacy Act written into law

The Colorado Privacy Act has been written into law, making Colorado the third US state with comprehensive privacy legislation. 


The Colorado Privacy Act (CPA) has recently been signed into law, giving the residents of Colorado comprehensive privacy protections for the first time. Colorado is now the third US state to enact such laws, and its act is very similar to those which came before it, with a few key differences. Unlike the California Consumer Privacy Act (CCPA), the CPA has adopted a controller/processor approach like that of the Washington Privacy Act (WPA), instead of a business/service provider perspective. The new law also looks very familiar next to this year’s Consumer Data Protection Act (CDPA) in Virginia, but with a slightly broader scope. 


The Colorado Privacy Act is intended to apply to businesses trading with Colorado residents acting only in an individual or household context. 


The CPA applies to any data controller that conducts business in Colorado, or delivers commercial products or services targeted at the residents of Colorado, and that meets either of the following requirements:


  • The business controls or processes personal data of at least 100,000 consumers during a single calendar year.
  • The business derives revenue or receives a discount from the sale of personal data, and processes or controls the personal data of at least 25,000 consumers.
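As a rough illustration only, the applicability test above can be sketched as a small check. The function name and parameters below are hypothetical, introduced for this sketch; they are not terms from the statute, and the statute itself remains the only authoritative source.

```python
def cpa_applies(conducts_business_in_colorado: bool,
                targets_colorado_residents: bool,
                consumers_processed_per_year: int,
                sells_personal_data: bool,
                consumers_whose_data_is_sold: int) -> bool:
    """Hypothetical sketch of the CPA applicability thresholds.

    The controller must conduct business in Colorado or target its
    residents, AND satisfy at least one of the two volume thresholds.
    """
    in_scope = conducts_business_in_colorado or targets_colorado_residents
    # Threshold 1: 100,000+ consumers processed in a calendar year.
    threshold_volume = consumers_processed_per_year >= 100_000
    # Threshold 2: derives revenue/discounts from sales of personal
    # data AND processes or controls data of 25,000+ consumers.
    threshold_sale = sells_personal_data and consumers_whose_data_is_sold >= 25_000
    return in_scope and (threshold_volume or threshold_sale)
```

For example, a business operating in Colorado that processes data of 50,000 consumers falls outside the first threshold, but comes into scope under the second as soon as it also derives revenue from selling the personal data of 25,000 of them.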


According to the CPA, “consumer” refers to a Colorado resident acting only as an individual or in a household context. This omits individuals acting in a commercial or employment context, as a beneficiary thereof, or as a job applicant. Like the CDPA, controllers operating under the CPA do not need to consider employee personal data as falling under this law.

The CPA applies to the exchange of personal data for monetary or other valuable consideration by a controller to a third party. 


Under the CPA, both monetary consideration and any other valuable consideration exchanged for personal data are considered the sale of personal information. Unlike under the CDPA, the sale is not defined solely by the exchange of monetary consideration. The sale described here excludes several types of disclosures. These include disclosures to a processor that processes personal data on behalf of a data controller, disclosures to a third party for the purpose of providing a product or service requested by a customer, disclosures to an affiliate of the controller, and disclosures to a third party as part of a proposed or actual merger, acquisition, bankruptcy or other transaction in which the third party assumes control of some or all of the controller’s assets. 

Deidentified data and publicly available information are not covered by the scope of the CPA’s definition of personal data. 


The CPA does not cover any publicly available information or deidentified data. The CPA defines publicly available data as “any information that is lawfully made available from … government records and information that a controller has a reasonable basis to believe the consumer has lawfully made available to the general public.” These are both explicitly excluded from the CPA, as is the case with the CDPA. Other exempt data under this law falls under two categories: entity-level exemptions and data-level exemptions. The entity-level exemptions are broader and exempt controllers from the need to comply with CPA obligations and rights on data collected, even when the data would otherwise be included. For example, the primary entity-level exemption under the CPA applies to entities which are already regulated by the Gramm-Leach-Bliley Act for financial institutions. 


The Colorado Privacy Act provides five main rights to the consumer. 

The CPA provides five main rights for the consumer. These include the right of access, right to correction, right to delete, right to data portability, and the right to opt out. The right of access gives consumers the right to confirm whether a controller is processing personal data concerning them and the right of access to that personal data. Under the CPA, consumers are also given the right to correct inaccuracies in their personal data, taking into account the nature of the personal data and the purpose of the processing. Consumers also have the right to delete their personal data. According to the right to data portability, consumers must be able to obtain their personal data in a portable and readily usable format which allows them to transmit the data to another entity without hindrance, where technically feasible. The CPA also gives consumers the right to opt out of the processing of their personal data for the purposes of targeted advertising, the sale of personal data, or profiling for decision-making that may produce legal or similarly significant effects concerning them.


There are several obligations to be fulfilled by controllers and processors under the CPA. 


The CPA imposes several obligations on controllers. These include the duties of transparency, purpose specification, data minimization, care, avoidance of secondary use, avoidance of unlawful discrimination, data protection assessments, data processing contracts, and specific duties regarding sensitive data. The CPA requires a controller to provide consumers with a reasonably accessible, clear and meaningful privacy notice. If their data is sold to a third party or processed for targeted advertising, the controller will have to clearly and conspicuously disclose the sale or processing, as well as give consumers the means to opt out. Controllers must specify the express purposes for which they are collecting and processing personal data at the time of the collection of this personal data. The CPA also institutes a policy of data minimization, requiring controllers to only collect personal data that is adequate, relevant and limited to what is reasonably necessary for the specified purposes of the collection and processing. In addition, data controllers are not allowed to process personal data for purposes that are not reasonably necessary to, or compatible with, the specified purposes for which it was collected; nor are controllers allowed to process sensitive data without consent. Data protection assessments and contracts are a necessary part of a controller’s obligations under the CPA. The CPA requires that processing must be governed by a contract between the controller and the processor.


Does your company have all of the mandated safeguards in place to ensure compliance with the CCPA, CPA, GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides GDPR, Data Protection Act 2018 and comparative law consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.


UNESCO Recommendation on AI ethics


UNESCO Recommendation on AI ethics has been agreed upon by Member States. 


The Member States of the United Nations Educational, Scientific and Cultural Organization (UNESCO) have agreed on the draft text of a recommendation on the ethics of artificial intelligence (AI). The initial draft of this document was shared with representatives of Member States in September 2020. An intergovernmental special committee of technical and legal experts then met in April and June this year to examine and compose the draft Recommendation on the Ethics of Artificial Intelligence, which is well on its way to becoming the first of its kind: a global framework for AI ethics. The final draft will be submitted to Member States for adoption at the 41st session of UNESCO’s General Conference in November. 


The Recommendation addresses ethical issues in AI as far as they fall within UNESCO’s mandate. It approaches artificial intelligence from a holistic, comprehensive, multicultural and evolving perspective, with the aim of helping societies deal with AI technologies responsibly, including both the known and unknown aspects of the technology. The document defines AI as “information-processing technologies that integrate models and algorithms that produce a capacity to learn and to perform cognitive tasks leading to outcomes such as prediction and decision-making in material and virtual environments.” The Recommendation focuses on the broader ethical implications of AI systems within UNESCO’s domains: education, science, culture, and communication and information. 


The UNESCO Recommendation for AI ethics aims to provide a framework for guiding the various stages of the AI system life cycle to promote human rights. 


The aim of the UNESCO Recommendation on the Ethics of Artificial Intelligence is to influence the actions of individuals, groups, communities, private companies and institutions to ensure that ethics rest at the heart of AI. The Recommendation seeks to foster dialogue and consensus building on issues relating to AI ethics on a multidisciplinary, multi-stakeholder level. This universal framework of values, principles and actions is aimed at protecting, promoting and respecting human rights and freedoms, equality, human dignity, gender equality and cultural diversity, and at preserving the environment, during each stage of the AI system life cycle. The Recommendation is packed with values and principles to be upheld by all actors in the field of AI, throughout the life cycle of the AI technology. Values play a major role as motivating ideals in the shaping of policy, legal norms and actions. These values are based on the recognition that the trustworthiness and integrity of the life cycle of AI systems are essential to ensuring that these technologies work for the good of all humanity.


The Recommendation includes a compilation of values considered to be apt at accomplishing the ethical standard for AI set out by UNESCO. 


The UNESCO Recommendation for AI ethics is built on the values of respect, protection and promotion of human rights and fundamental freedoms based on the inherent dignity of every human being, regardless of race, color, descent, age, gender, language, economic or social condition of birth, disability or any other grounds. It also states that environmental and ecosystem flourishing should be recognized, promoted and protected throughout the entire life cycle of AI systems. Finally, the Recommendation aims to ensure diversity and inclusiveness, and to promote living in peaceful, just and interconnected societies.


The document also outlines several principles under which AI technologies should operate, at every stage in their life cycle. 


Similar to the guidelines for trustworthy AI by the High-Level Expert Group on Artificial Intelligence (AI HLEG), UNESCO has outlined principles under which trustworthy AI should function. Principles like proportionality, safety, security, sustainability and the right to privacy and data protection are all explained in depth, guiding how AI should function worldwide. The Recommendation states clearly that it must always be possible to attribute legal and ethical responsibility to any stage of the AI system life cycle. The principles of transparency, explainability, responsibility, accountability, human oversight and determination are at the core of this Recommendation with specific mention of multi-stakeholder and adaptive governance and collaboration. Adaptive governance and collaboration ensures that States comply with international law while regulating the data passing through their territories. 


The UNESCO Recommendation for AI ethics goes on to guide policy areas to operationalize the values and principles that it sets out. 


The Recommendation encourages Member States to establish effective measures to ensure that other stakeholders uphold the values and principles of ethical AI technology. UNESCO, recognizing that various Member States will be at various stages of readiness to implement this recommendation, will develop a readiness assessment methodology to help Member States identify their status. The Recommendation suggests that all Member States should introduce frameworks for impact assessments to identify and assess benefits, concerns and risks of AI systems. In addition, Member States should ensure that AI governance procedures are inclusive, transparent, multidisciplinary, multilateral and multi-stakeholder. Governance should ensure that any harms caused through AI systems are investigated and redressed, through the enactment of strong enforcement mechanisms and remedial actions. Data policy, as well as international cooperation, are also extremely important to this global framework. The Recommendation suggests how Member States should assess the direct and indirect impact of AI systems, at every point in their life cycle, on the environment and ecosystems. It also upholds that policies surrounding digital technologies and AI should contribute to fully achieving gender equality, the preservation of culture, education and research, as well as the improvement of access to information and knowledge. The Recommendation also makes specific mention of how Member States should ensure the preservation of health and well-being through ethical AI use and practices.


“The proposal for a Regulation laying down harmonised rules on AI published by the European Commission in April 2021 and this Recommendation on AI ethics elaborated by UNESCO show the institutions’ interest in regulating AI, both at European and global levels. Businesses using AI now have an opportunity to adapt their systems and practices in order to be ready before the framework becomes mandatory”, points out Cristina Contero Almagro, Aphaia’s Partner. 

Do you use AI in your organisation and need help ensuring compliance with AI regulations? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Fine imposed for unsecured website


Fine imposed for unsecured website for registration of new orthodontic patients. 


Patient personal data, including citizen service numbers, was found to be at risk when an orthodontic practice allowed new patients to sign up via an unsecured website. According to this report, several fields of mandatory personal information were captured over an unsecured connection. This could have resulted in a data breach, which could have led to fraud, with several individuals affected, including minors. The Dutch DPA has imposed a fine of €12,000 on the orthodontic practice. 

Sensitive personal data was at risk of being accessed by unauthorized parties. 


An unsecured connection was used to capture mandatory personal information from new patients signing up for orthodontic services. 


The unsecured website being used to capture information from new patients included a form, requiring the input of personal data into mandatory fields. The required information included patients’ parents’ information, their general practitioner, insurance information as well as their dentist and citizen service number. This information was sent over an unencrypted connection, making it unsecured. Individuals submitting their personal information while signing up on the website of an orthodontic practitioner are trusting that their sensitive data will be protected. In addition, the majority of orthodontic patients are children and young adults, so this case involved the personal data of several children. Data protection laws have specific safeguards for the sensitive data of children, who are considered a particularly vulnerable group. 
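The core failing here was that the registration form was served and submitted over plain HTTP rather than HTTPS, so patient data travelled unencrypted. As a minimal sketch of the kind of check a practice could run on its own form URLs (the function name and the example domain are hypothetical, not taken from the DPA's report):

```python
from urllib.parse import urlparse

def form_url_is_encrypted(url: str) -> bool:
    """Return True only if the URL uses HTTPS, i.e. data submitted
    through a form at this address travels over an encrypted (TLS)
    connection rather than in the clear."""
    return urlparse(url).scheme == "https"

# A registration form served over plain HTTP sends patient data in
# the clear; the same form behind HTTPS does not.
print(form_url_is_encrypted("http://example-practice.nl/register"))   # False
print(form_url_is_encrypted("https://example-practice.nl/register"))  # True
```

A scheme check like this is of course only a first step: a compliant deployment would also need a valid TLS certificate and a redirect from HTTP to HTTPS so that no unencrypted endpoint remains reachable.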


Fine imposed for unsecured website after a complaint was lodged about a privacy violation. 


A complaint was lodged with the Dutch DPA regarding a privacy violation. Because the complaint was regarding poor security within the health sector, a sector with particularly strict privacy requirements, this complaint was taken very seriously by the DPA. Monique Verdier, the DPA’s deputy chair commented on the situation stating “When you register with an orthodontist, you entrust your personal data to them. This is data that the practice needs, but it is also of interest to criminals. Taking good care of your patients includes taking good care of their personal data. This applies to all care providers, not just large institutions.” It is a business’ responsibility to ensure that its website is GDPR compliant, and to secure customer data and websites, preventing possible data breaches, phishing, and other forms of malicious online activity. A fine of €12,000 was imposed on the orthodontic practitioner for this infraction. 


An objection to this fine was lodged, which the DPA declared unfounded. 

The fine imposed on the orthodontic practitioner is not final, and was challenged by the provider. While the fine may be revocable, the DPA has called the objection by the practitioner unfounded. An application for judicial review can be submitted to the district court to have the €12,000 fine revoked. If this is done, the final decision will rest in the hands of the district court. 


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.