Spanish DPA AEPD

Spanish DPA AEPD publishes Guidelines on AI audits

AEPD, the Spanish data protection authority, has published Guidelines on the requirements that should be met when auditing data processing activities that embed AI.

Earlier this month, the Spanish DPA, AEPD, published Guidelines on the requirements that should be considered when auditing personal data processing activities which involve AI elements. The document sets out the special controls to which such audits should be subject.

Audits are part of the technical and security measures regulated in the GDPR and are deemed essential for the proper protection of personal data. The AEPD Guidelines contain a list of audit controls from which the auditor can select the most suitable ones on a case-by-case basis, depending on several factors such as the way the processing may affect GDPR compliance, the type of AI component used, the type of data processing and the risks that the processing activities pose to the rights and freedoms of the data subjects.

Special features of AI audits methodology

The AEPD remarks that the audit process should be governed by the principles laid down in the GDPR, namely: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability.

The AEPD also points out that not all the controls listed in the Guidelines are meant to be applied together. The auditor should select those that are relevant based on the scope of the audit and the goals it pursues.

What type of data processing do these requirements apply to and who should comply with them?

The Guidelines will be applicable where:

  • There are personal data processing activities at any stage of the AI component lifecycle; or
  • The data processing activities aim to profile individuals or make automated decisions which produce legal effects concerning the data subjects or similarly significantly affect them.

The AEPD states that in some cases it might be useful to carry out some preliminary assessments before moving forward with the audit, such as, inter alia, an assessment of the level of anonymisation of the personal data, an assessment of the risk of re-identification and an assessment of the risk of losing data stored in the cloud.
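
By way of illustration, one common way to approximate the risk of re-identification is a k-anonymity check over the quasi-identifiers that remain in a dataset. The Python sketch below is our own illustrative example, not part of the Guidelines, and the field names are hypothetical:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the smallest group size over the given quasi-identifiers.
    A low k means some individuals are (nearly) unique, i.e. a high
    re-identification risk; k = 1 means at least one record is unique."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical pseudonymised dataset: no names, but quasi-identifiers remain.
dataset = [
    {"postcode": "28001", "age_band": "30-39", "diagnosis": "A"},
    {"postcode": "28001", "age_band": "30-39", "diagnosis": "B"},
    {"postcode": "28002", "age_band": "40-49", "diagnosis": "A"},
]
print(k_anonymity(dataset, ["postcode", "age_band"]))  # 1 -> a unique record, high risk
```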

The document is especially addressed to data controllers who audit personal data processing activities that include AI-based components; to data processors and developers who wish to offer additional guarantees around their products and services; to DPOs responsible for monitoring the data processing and advising the data controllers; and to auditors who work with this type of processing.

Control goals and actual controls

The main body of the Guidelines consists of five audit areas that are broken down into several objectives containing the actual controls, from which the auditors, or the person in charge of the process as relevant, can make their selection for the specific audit they are undertaking.

The AEPD provides an exhaustive list of more than a hundred controls, which are summarised in the following paragraphs.

  • AI component identification and transparency

This area includes the following objectives: inventory of the AI components, definition of responsibilities, and transparency.

The AEPD stresses the importance of keeping full records both of the components (including, inter alia, ID, version, date of creation and previous versions) and of the persons in charge of the process (such as their contact details, roles and responsibilities). There are also some provisions with regard to the information that should be available to the stakeholders, especially when it comes to the data sources, the data categories involved, the model and the logic behind the AI component, and the accountability mechanisms.
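
Purely as an illustration of what such an inventory record might look like in practice, here is a minimal sketch; the field names are our own and are not prescribed by the AEPD:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Contact:
    """A person in charge of the AI component."""
    name: str
    role: str   # e.g. "controller", "developer", "DPO"
    email: str

@dataclass
class AIComponentRecord:
    """One inventory entry covering the metadata the AEPD mentions."""
    component_id: str
    version: str
    created_on: date
    previous_versions: List[str] = field(default_factory=list)
    responsible: List[Contact] = field(default_factory=list)

# Hypothetical entry for a scoring model
record = AIComponentRecord(
    component_id="credit-scoring",
    version="2.1",
    created_on=date(2020, 11, 3),
    previous_versions=["1.0", "2.0"],
    responsible=[Contact("Jane Doe", "DPO", "dpo@example.com")],
)
```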

  • AI component purpose

There are several objectives within this area: identification of the AI component purposes, uses and context, proportionality and necessity assessment, data recipients, data storage limitation and analysis of the data subject categories.

The controls linked to these objectives are based on the standards and requirements needed to achieve the desired outcomes, as well as on the elements that may affect those outcomes, such as conditioning factors, socioeconomic conditions and the allocation of tasks; a risk assessment and a DPIA are recommended to address them.

  • AI component basis

This area is built on the following objectives: identification of the AI component development process and basic architecture, DPO involvement, and adequacy of the theoretical models and methodological framework.

The controls defined in this section mainly relate to the formal elements of the process and the methodology followed. They aim to ensure the interoperability between the AI component development process and the privacy policy, to define the requirements that the DPO should meet and guarantee their proper and timely involvement, and to set out the relevant revision procedures.

  • Data management

The AEPD details four objectives in this area: data quality, identification of the origin of the data sources, personal data preparation and bias control. 

While data protection is the leitmotiv throughout the Guidelines, it is especially prominent in this chapter, which covers, inter alia, data governance, variables and proportionality distribution, the lawful basis for processing, the reasoning behind the selection of data sources, and the categorisation of data and variables.
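
As a minimal illustration of what an automated bias control could involve, the sketch below checks how a variable is distributed across the training data and flags under-represented categories. The data and threshold are hypothetical, and real bias audits go well beyond simple distribution checks:

```python
from collections import Counter

def category_shares(records, attribute):
    """Share of each value of `attribute` across the training records."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    return {value: n / total for value, n in counts.items()}

def flag_underrepresented(shares, threshold):
    """Return the categories whose share falls below the chosen threshold."""
    return [value for value, share in shares.items() if share < threshold]

# Hypothetical training sample
training_data = [{"gender": g} for g in "FFMMMMMMMX"]
shares = category_shares(training_data, "gender")
print(shares)                               # {'F': 0.2, 'M': 0.7, 'X': 0.1}
print(flag_underrepresented(shares, 0.15))  # ['X']
```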

  • Verification and validation

Seven objectives are pursued in this area: verification and validation of the AI component, adequacy of the verification and validation process, performance, coherence, robustness, traceability and security. 

The controls set out in this area focus on ensuring data protection compliance throughout the ongoing implementation and use of the AI component. They look for guarantees around, among others, the existence of a standard which allows for verification and validation procedures once the AI component has been integrated, a schedule for internal inspections, an analysis of false positives and false negatives, a procedure to find anomalies, and mechanisms for identifying unexpected behaviour.
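
For instance, the analysis of false positives and false negatives could start from a simple confusion-matrix computation over a validation set. The sketch below is a generic illustration for a binary classifier, not a procedure defined in the Guidelines:

```python
def confusion_counts(y_true, y_pred):
    """Count true/false positives and negatives for a binary AI component."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

# Hypothetical validation labels and model outputs
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]
tp, fp, fn, tn = confusion_counts(y_true, y_pred)
print(f"false positive rate: {fp / (fp + tn):.2f}")  # 0.25
print(f"false negative rate: {fn / (fn + tp):.2f}")  # 0.25
```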

Final remarks

The AEPD concludes by reminding readers that the Guidelines take a data protection approach to the audit of AI components. This means, on the one hand, that they may need to be combined with additional controls derived from other perspectives and, on the other hand, that not all controls will be relevant in every case: they should be selected according to the specific needs, considering the type of processing, the client's requirements and the specific features of the audit and its scope, together with the results of the risk assessment.

Does your company use AI? You may be affected by the EU's future regulatory framework. We can help you. Aphaia provides both GDPR and DPA 2018 adaptation consultancy services, including data protection impact assessments, EU AI Ethics assessments and Data Protection Officer outsourcing. Contact us today.

Doctors fined by CNIL

Doctors fined by CNIL: The French DPA has sanctioned two health professionals over poor data protection.

Two doctors have been fined by the CNIL for insufficient data protection and for failing to notify a data breach.


Last month, the CNIL announced that two doctors in France were found to be in breach of Articles 32 and 33 of the GDPR. An online check in September 2019 revealed that thousands of images hosted on the doctors' servers were freely accessible online. The investigation concluded that the doctors had poorly configured their internet router and their medical imaging software, leading to the data breach. The doctors were fined €3,000 and €6,000 respectively, and while the CNIL thought it unnecessary to publish their names, it stressed the importance of publicising these decisions in order to alert health professionals to their obligations and to the need to strengthen their vigilance around security measures.


The doctors fined by the CNIL failed to adequately protect data, thereby breaching Article 32 of the GDPR.


According to Article 32 of the GDPR, data controllers and processors are responsible for implementing appropriate technical and organisational measures to guarantee a level of security appropriate to the risk, ensuring the ongoing confidentiality, integrity, availability and resilience of processing systems and services. A data protection impact assessment would have alerted the doctors in advance to the configuration faults that led to the data breach.


Article 32 of the GDPR states “In assessing the appropriate level of security account shall be taken in particular of the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.”


By failing to adequately notify the CNIL of the data breach, the two doctors also breached Article 33 of the GDPR.

According to Article 33 of the GDPR, controllers need to notify any data breach without undue delay and, where feasible, within 72 hours of becoming aware of it. After being notified that the images were freely accessible, the two doctors should have made the mandatory notifications, but failed to do so. Under the GDPR, notification is required "unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons." This breach compromised the medical images of the doctors' patients, directly affecting their rights, making it necessary to notify the authority.


CNIL made these decisions public in order to send a message to other medical professionals to ensure compliance with the GDPR.


While the CNIL did not find it necessary to publish the doctors' names, it felt it was important to report on the incident in order to urge other health professionals to be vigilant with their data protection measures. The aim is to encourage professionals to choose application solutions offering the strongest guarantees in terms of IT security and personal data protection; those who are not cautious when developing and configuring their internal IT systems risk the same fate. The CNIL suggests that professionals engage competent service providers where necessary to ensure compliance.


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.

ICO urges UK businesses

ICO urges UK businesses: ensure compliance with data protection law before the end of the UK's transition period.

The ICO urges UK businesses to ensure compliance with data protection law before the end of the UK's transition period on December 31st 2020.


December 31st 2020 will officially mark the end of the UK's transition period out of the EU, and the ICO is calling on UK businesses impacted by data protection law to take the necessary steps to ensure continued lawful data flows from the EU. The ICO advises that any business receiving data from organisations in the EU or European Economic Area (EEA, which includes the EU, Iceland, Norway and Liechtenstein) will need to take action to ensure the flow of data doesn't stop.


Many SMEs depend on the flow of personal data to operate, and the ICO seeks to aid these businesses during the transition. 

Personal data is anything that relates to an identifiable individual, whether information on customers or staff. HR records, customer details, payroll information and information collected through cloud services are all classified as personal data and may be affected. The ICO recognises that sharing personal data is essential to running the majority of SMEs, and that smaller organisations may not have dedicated data protection officers or specialists to help with the preparations. It has therefore published a statement advising businesses on steps they can take before January 1st to ensure continued compliance.

The ICO urges UK businesses to maintain compliance with the DPA 2018 and the GDPR, and to double-check their privacy information.


Businesses in the UK will need to continue to ensure compliance with the GDPR and DPA 2018. However, as regards the exchange of data between entities in the UK and the EU, as of January 1st 2021 businesses will need to have safeguards in place so that the continued flow of data is lawful. The ICO has gathered guidance and resources on its website and urges businesses to make use of them to determine the actions they may need to take if they use personal data. In addition, businesses should review their privacy information and other documentation for possible changes that need to be made at the end of the transition period.


For most businesses and organisations, the ICO suggests Standard Contractual Clauses (SCCs) to keep data flowing on EU-approved terms. 

The ICO statement suggests that standard contractual clauses, or SCCs, may be the best option for businesses that use personal data and want to ensure their data transfers are EU-approved. As businesses in the UK will officially be treated as non-EU processors or controllers come January 1st, SCCs, which have proven to be a sufficient safeguard for transfers of data between controllers and processors within the EU and internationally, have been recommended as the best option for UK businesses to adopt post-transition.


Do you make international data transfers to third countries? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, transfer impact assessments and Data Protection Officer outsourcing. Contact us today.

Google and Amazon fined

Google and Amazon fined: CNIL has fined the two major companies for unlawful use of cookies.

Google and Amazon have been fined by France's CNIL for placing cookies on users' computers without obtaining prior consent or providing satisfactory information.

The CNIL reported last week that both companies have been sanctioned for their misuse of cookies, in breach of the French Data Protection Act. Following several investigations carried out between December 12th 2019 and May 19th 2020 on amazon.fr, and on March 16th 2020 on google.fr, the CNIL found that the websites of both companies violated Article 82 of the Data Protection Act.

Google was found to have three violations of Article 82 of the DPA, while Amazon had two of those three.

Upon investigation, both websites were found to have been placing cookies on users' computers automatically, without any action on the users' part and without their prior consent. These cookies were deemed non-essential to the use of the service and should only be placed once the user has expressed consent. This practice violates Article 82 of the DPA, which requires prior consent before cookies are placed on users' computers.
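
For illustration only, here is a minimal sketch of what consent-gated cookie placement might look like on the server side, assuming a Flask application. The cookie names and the "ad_consent" flag are hypothetical and are not taken from either ruling:

```python
from flask import Flask, make_response, request

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("<h1>Home</h1>")
    # Cookies essential to the service (e.g. session handling) may be set
    # without consent.
    resp.set_cookie("session_id", "abc123", httponly=True)
    # Non-essential advertising cookies are only set after the user has
    # expressed consent, recorded here in a hypothetical "ad_consent" cookie
    # written by a consent banner.
    if request.cookies.get("ad_consent") == "granted":
        resp.set_cookie("ad_tracking", "enabled")
    return resp
```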

While both google.fr and amazon.fr displayed brief statements via a banner at the bottom of the screen, informing visitors either of the company's confidentiality agreement (in the case of Google) or of the user's acceptance of cookies through their use of the website (in the case of Amazon), both banners were found to have informed users inadequately, resulting in further breaches of Article 82. In Google's case, the banner gave users no information at all about the cookies which had already been automatically placed on their computers, and the "Consult now" button placed on the banner at google.fr did not lead users to any information on those cookies either.

On amazon.fr, while the banner informed users of their automatic acceptance of cookies by using the site, this information was found to be neither clear nor complete. The banner did not specify that the cookies placed on users' computers were mainly used to display personalised ads, and it failed to explain that users could refuse these cookies or how to do so.

In addition, on google.fr, even after a user deactivated the personalisation of ads through the mechanism provided via the "Consult now" button, one of the advertising cookies remained stored on the user's computer and continued to read information intended for its server. This faulty "opposition" mechanism on Google's website resulted in an additional violation of Article 82 of the DPA.

Google and Amazon were fined a total of 100 million euros and 35 million euros respectively.

GOOGLE LLC was fined 60 million euros and GOOGLE IRELAND LIMITED 40 million euros. The authority justified these fines, and its decision to make them public, by the seriousness of Google's triple breach of Article 82, the search engine's reach and the fact that nearly fifty million users were affected. The advertising revenues generated by companies like Google derive indirectly from the data collected by the advertising cookies placed on users' computers. Since a September 2020 update to google.fr, cookies are no longer automatically placed on users' computers; however, the information banner still did not inform users residing in France of the purposes for which cookies are used, nor that they could refuse them. In addition to the fines, an injunction was issued carrying a penalty of 100,000 euros per day if, after three months, the companies were still not adequately informing users in accordance with Article 82 of the DPA.

AMAZON EUROPE CORE was fined 35 million euros, and this fine was likewise publicised owing to the seriousness of the breaches. Given the popularity of amazon.fr, millions of France's residents visited the site daily and had cookies placed on their computers. In addition, since the company's main activity is the sale of consumer goods, the personalised ads made possible by those cookies led to a significant increase in the visibility of its products on other websites. It was also taken into account that, until the redesign of amazon.fr in September 2020, the company was continuously placing cookies on the computers of users living in France without informing them: regardless of the path that led users to the site, they were either insufficiently informed, or not informed at all, that cookies were being placed on their computers. Amazon likewise faces an additional fine of 100,000 euros per day if it does not comply with the Act within three months.

The CNIL has released amended guidelines and recommendations regarding the use of cookies, in accordance with the GDPR.

On October 1st 2020, the CNIL released its guidelines on the use of cookies and other tracking devices. These guidelines are part of its action plan on targeted advertising and the enforcement of the GDPR. The CNIL is asking all parties to comply with the rules clarified therein, specifying that the adaptation period should not exceed six months. It has also indicated that it will continue to monitor other requirements which have not been modified and, if necessary, adopt corrective measures to protect the privacy of individuals.

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.