Facebook and WhatsApp data sharing

Facebook and WhatsApp data sharing requires further investigation, says EDPB

Further investigation is required by the Irish Supervisory Authority before a final decision can be made regarding Facebook's processing of WhatsApp user data. 


The EDPB has adopted an urgent binding decision pursuant to Article 66 of the GDPR, requiring the Irish Supervisory Authority to carry out an investigation rather than take final measures, following a recent change in WhatsApp's Terms of Service and Privacy Policy. The Hamburg Supervisory Authority had adopted provisional measures against Facebook Ireland, ordering a ban on the company processing WhatsApp user data for its own purposes. However, the EDPB believes that further investigation is required to gain clarity on the processing activities in question. 


The EDPB concluded that the situation does not require any final measures, as the conditions for demonstrating the existence of an infringement or of urgency have not been met. 


The EDPB concluded, based on the evidence presented, that no final measures needed to be taken by the Supervisory Authority at this time. For one, the EDPB believes there is a high likelihood that WhatsApp user data is already being processed by Facebook Ireland on the basis of joint controllership, most likely for the purpose of the safety, security and integrity of all Facebook Companies, including WhatsApp. Nonetheless, the EDPB is unable to determine with certainty which processing operations are actually being carried out, and in what capacity, due to various uncertainties and ambiguities in the information provided to WhatsApp users. Further investigation into those conditions is therefore required before any final decision is made, especially given the absence of any indication of a clear infringement or of urgency in this matter. 


The EDPB says further investigations are required by the Supervisory Authority to determine whether Facebook Ireland acts as a processor or joint controller with WhatsApp Ireland. 


While it is likely that Facebook is operating as a joint controller with respect to the processing of WhatsApp user data, the EDPB considers this unclear at this time and would like the Irish Supervisory Authority to investigate further and clarify whether Facebook Ireland is indeed acting as a joint controller or a processor. Currently, there is insufficient information on how data is processed for marketing purposes among the various Facebook Companies. Further investigation is also required to determine whether there is a proper legal basis for those processing activities under the GDPR. 


The official binding decision will be published on the EDPB's website once it has been properly assessed to ensure that any confidential information is redacted. However, all relevant Supervisory Authorities, as well as Facebook Ireland and WhatsApp Ireland, have been informed of the EDPB's decision. 


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Colorado Privacy Act signed into law

The Colorado Privacy Act has been signed into law, making Colorado the third US state with a comprehensive privacy law. 


The Colorado Privacy Act (CPA) has recently been signed into law, giving the residents of Colorado comprehensive privacy protections for the first time. Colorado is now the third US state to enact such a law, and the CPA closely resembles those that came before it, with a few key differences. Unlike the California Consumer Privacy Act (CCPA), the CPA adopts a Washington Privacy Act (WPA)-style controller / processor approach rather than a business / service provider model. The law looks very similar to this year's Consumer Data Protection Act (CDPA) in Virginia, with a slightly broader scope. 


The Colorado Privacy Act is intended to apply to businesses trading with Colorado residents who are acting in an individual or household context. 


The CPA applies to any data controller that conducts business in Colorado, or delivers commercial products or services targeted at Colorado residents, and that meets either of the following requirements:


  • The business controls or processes the personal data of at least 100,000 consumers during a single calendar year; or
  • The business derives revenue or receives a discount on goods or services from the sale of personal data, and processes or controls the personal data of at least 25,000 consumers.
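As a rough illustration, the targeting requirement and the two thresholds above can be combined into a single applicability check. The function below is a hypothetical sketch for orientation only, not legal logic drawn from the statute's text; all names and parameters are illustrative assumptions.

```python
# Hypothetical sketch of the CPA applicability test described above.
# Parameter names and structure are illustrative, not taken from the statute.

def cpa_applies(conducts_business_in_colorado: bool,
                targets_colorado_residents: bool,
                consumers_per_year: int,
                sells_personal_data_for_consideration: bool) -> bool:
    """Return True if a controller plausibly meets the CPA thresholds."""
    # The controller must do business in Colorado or target its residents...
    in_scope = conducts_business_in_colorado or targets_colorado_residents
    # ...and meet either the volume threshold (100,000 consumers/year)...
    volume_threshold = consumers_per_year >= 100_000
    # ...or the sale threshold (revenue/discount from data sales plus
    # 25,000 consumers' data processed or controlled).
    sale_threshold = (sells_personal_data_for_consideration
                      and consumers_per_year >= 25_000)
    return in_scope and (volume_threshold or sale_threshold)

# Example: a retailer targeting Colorado residents that sells data
# on 30,000 consumers would meet the second threshold.
print(cpa_applies(False, True, 30_000, True))  # True
```

Note that a business processing data on 200,000 consumers but neither operating in Colorado nor targeting its residents would still fall outside the law's scope under this reading.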


According to the CPA, "consumer" refers to a Colorado resident acting only as an individual or in a household context. This excludes individuals acting in a commercial or employment context, as a job applicant, or as a beneficiary of someone acting in an employment context. As with the CDPA, controllers operating under the CPA do not need to treat employee personal data as covered by this law.

The CPA applies to the exchange of personal data for monetary or other valuable consideration by a controller to a third party. 


Under the CPA, both monetary consideration and any other valuable consideration exchanged for personal data constitute a sale of personal information. Unlike under the CDPA, a sale is not defined solely by the exchange of monetary consideration. The sale described here excludes several types of disclosures: disclosures to a processor that processes personal data on behalf of a controller; disclosures to a third party for the purpose of providing a product or service requested by a customer; disclosures to an affiliate of the controller; and disclosures to a third party as part of a proposed or actual merger, acquisition, bankruptcy or other transaction in which the third party assumes control of some or all of the controller's assets. 

Deidentified data and publicly available information are not covered by the scope of the CPA’s definition of personal data. 


The CPA does not cover publicly available information or deidentified data. It defines publicly available information as "any information that is lawfully made available from … government records and information that a controller has a reasonable basis to believe the consumer has lawfully made available to the general public." Both are explicitly excluded from the CPA, as is the case with the CDPA. Other exempt data under this law falls under two categories: entity-level exemptions and data-level exemptions. The entity-level exemptions are broader, exempting controllers from the need to comply with CPA obligations and rights on data collected, even when the data would otherwise be included. For example, the primary entity-level exemption under the CPA applies to entities that are already regulated by the Gramm-Leach-Bliley Act for financial institutions. 


The Colorado Privacy Act provides five main rights to the consumer. 

The CPA provides five main rights for the consumer: the right of access, the right to correction, the right to delete, the right to data portability, and the right to opt out.

  • Right of access: consumers may confirm whether a controller is processing personal data concerning them and may access that personal data.
  • Right to correction: consumers may correct inaccuracies in their personal data, taking into account the nature of the personal data and the purpose of the processing.
  • Right to delete: consumers may have their personal data deleted.
  • Right to data portability: consumers must be able to obtain their personal data in a portable and readily usable format that allows them to transmit the data to another entity without hindrance, where technically feasible.
  • Right to opt out: consumers may opt out of the processing of their personal data for the purposes of targeted advertising, the sale of personal data, or profiling for decision-making that may produce legal or similarly significant effects concerning them.


There are several obligations to be fulfilled by controllers and processors under the CPA. 


The CPA imposes several obligations on controllers, including duties of transparency, purpose specification, data minimization, care, avoidance of secondary use, avoidance of unlawful discrimination, data protection assessments, data processing contracts, and specific duties regarding sensitive data. The CPA requires a controller to provide consumers with a reasonably accessible, clear and meaningful privacy notice. If consumers' data is sold to a third party or processed for targeted advertising, the controller must clearly and conspicuously disclose the sale or processing and give consumers the means to opt out. Controllers must specify the express purposes for which they are collecting and processing personal data at the time of collection. The CPA also institutes a policy of data minimization, requiring controllers to collect only personal data that is adequate, relevant and limited to what is reasonably necessary for the specified purposes of the collection and processing. In addition, data controllers are not allowed to process personal data for purposes that are not reasonably necessary to, or compatible with, the specified purposes for which it was collected, nor are they allowed to process sensitive data without consent. Data protection assessments and contracts are a necessary part of a controller's obligations under the CPA, which requires that processing be governed by a contract between the controller and the processor.


Does your company have all of the mandated safeguards in place to ensure compliance with the CCPA, CPA, GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides GDPR, Data Protection Act 2018 and comparative law consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.


UNESCO Recommendation on AI ethics

UNESCO Recommendation on AI ethics agreed upon by Member States

UNESCO Recommendation on AI ethics has been agreed upon by Member States. 


The Member States of the United Nations Educational, Scientific and Cultural Organization (UNESCO) have agreed on the draft text of a Recommendation on the Ethics of Artificial Intelligence (AI). The initial draft of the document was shared with representatives of Member States in September 2020, and the Intergovernmental special committee of technical and legal experts met in April and June of this year to examine and compose the draft Recommendation, which is well on its way to becoming the first of its kind: a global framework for AI ethics. The final draft will be submitted to Member States for adoption at the 41st session of UNESCO's General Conference in November. 


The Recommendation addresses ethical issues in AI as far as they fall within UNESCO's mandate, approaching artificial intelligence from a holistic, comprehensive, multicultural and evolving perspective. The aim is to help deal with AI technologies responsibly, covering both the known and unknown aspects of the technology. The document defines AI as "information-processing technologies that integrate models and algorithms that produce a capacity to learn and to perform cognitive tasks leading to outcomes such as prediction and decision-making in material and virtual environments." The Recommendation focuses on the broader ethical implications of AI systems in UNESCO's central domains: education, science, culture, and communication and information. 


The UNESCO Recommendation for AI ethics aims to provide a framework for guiding the various stages of the AI system life cycle to promote human rights. 


The aim of the UNESCO Recommendation on the Ethics of Artificial Intelligence is to influence the actions of individuals, groups, communities, private companies and institutions to ensure that ethics rest at the heart of AI. The Recommendation seeks to foster dialogue and consensus building on issues relating to AI ethics at a multidisciplinary, multi-stakeholder level. This universal framework of values, principles and actions is aimed at protecting, promoting and respecting human rights and freedoms, equality, human dignity, gender equality, cultural diversity and the environment, during each stage of the AI system life cycle. The Recommendation is packed with values and principles to be upheld by all actors in the field of AI throughout the life cycle of the technology. Values play a major role as motivating ideals in the shaping of policy, legal norms and actions. These values are based on the recognition that the trustworthiness and integrity of the life cycle of AI systems is essential to ensuring that these technologies work for the good of all humanity.


The Recommendation includes a compilation of values considered apt to accomplish the ethical standard for AI set out by UNESCO. 


The UNESCO Recommendation on AI ethics is built on the values of respect for, and the protection and promotion of, human rights and fundamental freedoms based on the inherent dignity of every human, regardless of race, color, descent, age, gender, language, economic or social condition of birth, disability or any other grounds. It also states that environmental and ecosystem flourishing should be recognized, promoted and protected throughout the entire life cycle of AI systems. The Recommendation further aims to ensure diversity and inclusiveness and to promote peaceful, just and interconnected societies.


The document also outlines several principles under which AI technologies should operate, at every stage in their life cycle. 


Similar to the guidelines for trustworthy AI from the High-Level Expert Group on Artificial Intelligence (AI HLEG), UNESCO has outlined principles under which trustworthy AI should function. Principles such as proportionality, safety, security, sustainability and the right to privacy and data protection are all explained in depth, guiding how AI should function worldwide. The Recommendation states clearly that it must always be possible to attribute legal and ethical responsibility at any stage of the AI system life cycle. The principles of transparency, explainability, responsibility, accountability, and human oversight and determination are at the core of the Recommendation, with specific mention of multi-stakeholder and adaptive governance and collaboration, which ensure that States comply with international law while regulating the data passing through their territories. 


The UNESCO Recommendation on AI ethics goes on to outline policy areas that operationalize the values and principles it sets out. 


The Recommendation encourages Member States to establish effective measures to ensure that other stakeholders uphold the values and principles of ethical AI technology. UNESCO, recognizing that various Member States will be at various stages of readiness to implement this Recommendation, will develop a readiness assessment methodology to help Member States identify their status. The Recommendation suggests that all Member States should introduce frameworks for impact assessments to identify and assess the benefits, concerns and risks of AI systems. In addition, Member States should ensure that AI governance procedures are inclusive, transparent, multidisciplinary, multilateral and multi-stakeholder. Governance should ensure that any harms caused through AI systems are investigated and redressed, through the enactment of strong enforcement mechanisms and remedial actions. Data policy, as well as international cooperation, is also extremely important to this global framework. The Recommendation suggests how Member States should assess the direct and indirect impact of AI systems, at every point in their life cycle, on the environment and ecosystems. It also upholds that policies surrounding digital technologies and AI should contribute to fully achieving gender equality, the preservation of culture, education and research, as well as the improvement of access to information and knowledge. The Recommendation also makes specific mention of how Member States should ensure the preservation of health and well-being through ethical AI use and practices.


“The proposal for a Regulation laying down harmonised rules on AI published by the European Commission in April 2021 and this Recommendation on AI ethics elaborated by UNESCO show the institutions’ interest in regulating AI, both at European and global levels. Businesses using AI now have an opportunity to adapt their systems and practices in order to be ready before the framework becomes mandatory”, points out Cristina Contero Almagro, Aphaia’s Partner. 

Do you use AI in your organisation and need help ensuring compliance with AI regulations? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Adequacy decisions adopted

Adequacy decisions adopted for EU-UK data transfers

Adequacy decisions adopted by the European Union for the UK regarding data transfers.


The European Commission has recently adopted adequacy decisions for the United Kingdom. Since Brexit, there has been some question as to the UK's adequacy, or rather the level of protection afforded to data transfers between the EU and the UK. With the adoption of these adequacy decisions, one under the General Data Protection Regulation (GDPR) and the other under the Law Enforcement Directive, data can now flow freely between the European Union and the United Kingdom. Data transferred to the UK will be considered to have a level of protection equivalent to that guaranteed under EU law.


The adequacy decisions were adopted after a thorough assessment process, during which data transfers continued on the basis of the Trade and Cooperation Agreement. 


Since the draft adequacy decisions for the UK were published in February, the UK's laws and practices regarding personal data protection have been carefully assessed. In April, the EDPB gave its opinion on UK adequacy, which was followed by a comitology procedure that included a vote from EU Member States. While the adequacy decisions were being established, data transfers between the EU and the UK continued on the basis of the Trade and Cooperation Agreement, which provided that, in the absence of an adequacy decision, all data transfers carried out in the context of its implementation would comply with the GDPR and the Law Enforcement Directive. This interim arrangement expired on June 30, 2021. 


UK data protection laws still very much resemble the laws under which the country operated as an EU Member State.


As a former EU Member State, the UK retains a data protection system based on the very rules under which it operated while still a member. The principles, rights and obligations of the GDPR and the Law Enforcement Directive have been fully incorporated into UK law. This made not only the Trade and Cooperation Agreement but also the adequacy decisions easier and more feasible. The UK provides strong safeguards regarding access to personal data by public authorities: in principle, the collection of data by intelligence authorities is subject to prior authorization by an independent judicial body. 


The adequacy decisions include a sunset clause which causes them to expire after four years.


These adequacy decisions include a 'sunset clause', the first of its kind, which strictly limits their period of validity. The decisions will automatically expire after four years, at which point the adequacy findings may be renewed, provided the UK continues to ensure an adequate level of data protection. The European Commission will continue to monitor the legal situation in the UK and reserves the right to intervene at any point if the UK deviates from the level of data protection currently provided. If, after the four-year period, the European Commission decides to renew the adequacy decisions, the adoption process would start over.


GDPR adequacy related to immigration control has been excluded from this decision, to be reassessed once a recent judgment of the England and Wales Court of Appeal has been addressed under UK law.


Due to a recent judgment of the England and Wales Court of Appeal, data transfers for the purposes of UK immigration control have been excluded from the scope of the GDPR adequacy decision. The judgment affects the validity and interpretation of certain data protection rights related to immigration control; once this matter has been dealt with under UK law, the Commission will reassess the necessity of this exclusion. 


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.