GDPR Challenges For Artificial Intelligence

Data protection in algorithms

Technological development is enabling the automation of all processes, much as Henry Ford did in 1914; the difference is that now, instead of cars, we have decisions about privacy. Since GDPR came into force on 25th May 2018, many questions have arisen regarding whether the Regulation may block data-based projects.

In this article, we aim to clarify some of the main GDPR concepts that may apply to the processing of large amounts of data and algorithmic decision-making. It has been inspired by the report the Norwegian Data Protection Authority (Datatilsynet) published in January this year: “Artificial Intelligence and Privacy”.

Artificial intelligence and the elements it comprises, such as algorithms and machine/deep learning, are affected by GDPR for three main reasons: the huge volume of data involved, the need for a training dataset and the capacity for automated decision-making without human intervention. These three ideas engage four GDPR principles: fairness of processing, purpose limitation, data minimisation, and transparency. We briefly explain each of them in the following paragraphs: the first paragraph of each concept presents the issue and the second describes how to address it under GDPR.

One should take into account that, without a lawful basis for automated decision-making (contract or consent), such processing cannot take place.

Fairness of processing: A discriminatory result after automated data processing can derive both from the way the training data has been labelled (supervised learning) and from the characteristics of the dataset itself (unsupervised learning). In the first case, the algorithm will produce results that correspond to the labels used in training, so if the training data was biased, so will be the output. In the second scenario, where the training dataset comprises two categories of data with different weights and the algorithm is risk-averse, the algorithm will tend to favour the group with the higher weight.

GDPR compliance at this point requires implementing regular tests in order to monitor distortion of the dataset and minimise the risk of error.
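As an illustration, the Python sketch below shows one form such a regular test could take: a demographic parity check over a log of automated decisions. The DataFrame layout, the column names "group" and "approved" and the 0.8 alert threshold are assumptions made for this example; GDPR does not prescribe any particular fairness metric or threshold.

import pandas as pd

def demographic_parity_ratio(decisions: pd.DataFrame,
                             group_col: str = "group",
                             outcome_col: str = "approved") -> float:
    """Ratio of the lowest to the highest approval rate across groups."""
    rates = decisions.groupby(group_col)[outcome_col].mean()
    return rates.min() / rates.max()

def run_regular_fairness_test(decisions: pd.DataFrame,
                              threshold: float = 0.8) -> bool:
    """Flag the model for review when approval rates diverge too much.

    The 0.8 threshold mirrors the common "four-fifths" rule of thumb;
    it is an illustrative choice, not a GDPR requirement.
    """
    ratio = demographic_parity_ratio(decisions)
    if ratio < threshold:
        print(f"Possible bias: demographic parity ratio {ratio:.2f} is below {threshold}")
        return False
    return True

# Hypothetical decision log: group B is approved less often than group A.
log = pd.DataFrame({"group": ["A", "A", "A", "B", "B", "B"],
                    "approved": [1, 1, 1, 1, 0, 0]})
run_regular_fairness_test(log)  # prints a bias alert and returns False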

Purpose limitation: Where previously collected personal data is to be re-used, the controller must consider whether the new purpose is compatible with the original one. If it is not, new consent is required or the basis for processing must be changed. This principle applies both to the internal re-use of data and to the sale of data to other companies. The only exceptions to the principle relate to scientific or historical research, or to statistical or archival purposes in the public interest. GDPR states that scientific research should be interpreted broadly and include technological development and demonstration, basic research, as well as applied and privately financed research. These elements would indicate that, in some cases, the development of artificial intelligence may be considered to constitute scientific research. However, when a model develops on a continuous basis, it is difficult to differentiate between development and use, and hence to say where research stops and usage begins. It is therefore difficult to reach a conclusion regarding the extent to which the development and use of these models constitute scientific research.

Personal data used to train algorithms should come from a dataset originally collected for that purpose, either with the consent of the data subjects concerned or following anonymisation.

Data minimisation: The need to collect and retain only the data that are strictly necessary, without duplication, requires pre-planning and a detailed study before the algorithm is developed, so that its purpose and usefulness are well explained and defined.

This may be achieved by making it difficult to identify individuals from the basic data held. The degree of identification is restricted by both the amount and the nature of the information used, as some details reveal more about a person than others. While the deletion of information is not feasible in this type of application because of continuous learning, privacy by default and by design must govern any machine learning process, applying encryption or anonymised data whenever possible. The use of pseudonymisation or encryption techniques protects the data subject’s identity and helps limit the extent of the intervention.
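By way of example, here is a minimal Python sketch of a keyed-hash pseudonymisation step that could be applied before records enter a machine learning pipeline. The key, field names and record layout are illustrative assumptions; note that pseudonymised data, unlike anonymised data, remains personal data under GDPR because re-identification is possible for whoever holds the key.

import hashlib
import hmac

# The key must be stored separately from the pseudonymised dataset,
# e.g. in a key management service; this value is a placeholder.
PSEUDONYMISATION_KEY = b"store-this-key-separately"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable token (HMAC-SHA256).

    The same input always yields the same token, so records can still be
    linked for continuous learning, while the identity stays hidden from
    anyone who does not hold the key.
    """
    return hmac.new(PSEUDONYMISATION_KEY,
                    identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"customer_id": "ES-12345678Z", "purchase_total": 99.90}
record["customer_id"] = pseudonymise(record["customer_id"])
print(record)  # the direct identifier no longer appears in the training data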

Transparency, information and right to explanation: Every data processing operation should be preceded by the provision of information to the data subjects, together with a number of additional guarantees for automated decision-making and profiling, such as the right to obtain human intervention on the part of the controller, to express one's point of view, to challenge the decision and to receive an explanation of the decision reached after the evaluation.

GDPR does not specify whether the explanation should refer to the general logic on which the algorithm is built or to the specific logical path followed to reach a particular decision, but the accountability principle requires that the data subject be given a satisfactory explanation, which may include a list of data variables, the ETL (extract, transform and load) process or the model features.
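As one possible illustration of what such an explanation could draw on, the Python sketch below trains a simple logistic regression (using scikit-learn) on made-up data and lists each variable's contribution to a specific decision. The feature names, training data and contribution measure (coefficient times value) are assumptions for the example only; GDPR does not mandate any particular explanation technique.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical variables behind an automated decision, e.g. a credit check.
feature_names = ["age", "income_thousands", "years_as_customer"]
X_train = np.array([[25, 20, 1], [40, 55, 10], [35, 30, 3], [52, 80, 20]])
y_train = np.array([0, 1, 0, 1])  # 1 = application approved

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def explain_decision(x: np.ndarray):
    """List each variable's contribution (coefficient * value) to the score,
    sorted by absolute impact: one way to present 'data variables and
    model features' alongside an automated decision."""
    contributions = model.coef_[0] * x
    return sorted(zip(feature_names, contributions),
                  key=lambda item: abs(item[1]), reverse=True)

applicant = np.array([30, 25, 2])
for name, contribution in explain_decision(applicant):
    print(f"{name}: {contribution:+.3f}")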

A data protection impact assessment, carried out with the advice of the DPO, is required before any processing involving algorithms, artificial intelligence or profiling, in order to evaluate and address the risks to the rights and freedoms of data subjects.

 

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessment, and Data Protection Officer outsourcing.

GDPR biometric data explained by Spanish DPA

Spanish Supervisory Authority (AEPD) opinion on GDPR biometric data

The AEPD's 10th Annual Session took place last June, and some of the main questions addressed at the meeting have now been published.

Participants were especially concerned about GDPR biometric data and its processing under certain circumstances, such as in the labour sphere.

The Spanish data protection legislation predating GDPR (LOPD) did not contain any specific definition of “biometric data”; it was instead included within the general concept of “personal data”. This meant that there were no particular requirements to be taken into account for the processing of such information.

According to GDPR, “biometric data” is “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”. Additionally, GDPR biometric data is a special category of personal data (Article 9 GDPR), which means that its processing is prohibited except in certain cases, such as where there is explicit consent from the data subject.

As one may note, there is a big difference between the previous legislation (LOPD) and the current one (GDPR), which has not yet been fully implemented at national level, which is why the AEPD's opinion becomes so important for current and future data protection issues in Spain.

Participants asked about the use of biometric technology with facial recognition where an Article 9 GDPR exception applies. The AEPD claimed that minimisation and lawfulness should govern any data processing. However, two scenarios were singled out: the labour sphere and critical infrastructures. The latter requires additional security measures that might themselves justify the use of biometric technologies. The labour sphere is subject to specific labour legislation (“Estatuto de los Trabajadores” (ET) in Spain), which imposes its own requirements. The AEPD stated that, under such legislation, the use of biometric data in companies falls within the scope of employee monitoring, so it is subject to proportionality and prior information rather than employees’ consent. Nevertheless, the AEPD did stress the importance of good practices, and asserted that it is highly recommended to avoid storing such data centrally (e.g. by keeping the data on a smart card that remains in the employee's possession).

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services and Data Protection Officer outsourcing.

GDPR parents' access

GDPR parents’ access to the university marks of children over 18

The Spanish supervisory authority (Agencia Española de Protección de Datos – AEPD) has published an opinion on parents' right of access under GDPR to the university marks and other associated information of their children over 18.


Data relating to university enrolment, marks or scholarships is personal data under GDPR, and where the university discloses such information to parents it is processing personal data, which means that the disclosure must comply with GDPR requirements and be covered by a legal basis. So what about GDPR parents' access to their adult children's student data?

Lawfulness of processing comprises scenarios other than consent, so even where the student has not consented to the disclosure, parents may be able to access the data. Article 6.1(f) states that processing shall be lawful if “processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child”. Because the Regulation is quite similar to the previous legislation on this point, the AEPD has largely relied on the case law of the Court of Justice of the European Union (CJEU) in this matter.

The AEPD refers to the CJEU judgment ECLI:EU:C:2017:336 and notes that the Article 6.1(f) requirements are cumulative, so all of them must be met before personal data can be disclosed in these cases:

  • There are legitimate interests pursued by the controller or by a third party;
  • Processing is necessary to fulfil such legitimate interests;
  • Legitimate interests are not overridden by the interests or fundamental rights and freedoms of the data subject.

The AEPD asserts that a legitimate interest exists where the student is economically dependent and the university studies are financed by the parents. The dependency criterion also applies to child support and other similar scenarios but, according to the AEPD, it does not apply where the son or daughter is missing and there is evidence showing a lack of choice in leaving home.

In any case, the data subject has the right to object, so the controller must inform him or her about the disclosure so that he or she is aware of it. Where the individual exercises that right, the controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject, or for the establishment, exercise or defence of legal claims.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services and Data Protection Officer outsourcing.

Spain approves new Data Protection Law

New Royal Decree-Law on Data Protection in Spain

A new Royal Decree-Law on Data Protection has been approved in Spain as part of the GDPR adaptation process. A Royal Decree-Law is a legal rule having the force of law in the Spanish legal system. This is an important regulatory measure for privacy in the country since, at present, GDPR coexists with the previous and still applicable national data protection law (“Ley Orgánica de Protección de Datos” – “LOPD”); both laws are valid but they contradict each other on some points, which results in situations that are difficult to resolve, even though GDPR prevails over the LOPD in the event of a conflict.

The Royal Decree-Law (“RDL 5/2018”) does not cover the whole of GDPR; it only regulates the following subjects:

  • Chapter I. Investigatory powers of the national supervisory authority (“Agencia Española de Protección de Datos” – “AEPD”) and the rules related to joint operations of supervisory authorities.
  • Chapter II. Conditions for imposing administrative fines, especially:

- Subjects responsible for an infraction: controllers, processors, representatives in the EU of non-EU controllers and processors, certification entities and entities supervising codes of conduct. It states that the data protection officer shall not be held responsible.

- Limitation periods for infractions: three years for infringements of Articles 83.5 and 83.6 GDPR (up to €20,000,000 or 4% of the total worldwide annual turnover of the preceding financial year) and two years for infringements of Article 83.4 GDPR (up to €10,000,000 or 2% of the total worldwide annual turnover of the preceding financial year); a worked example of how these ceilings combine follows this list.

- Limitation periods for paying fines: one year up to €40,000, two years from €40,001 to €300,000, three years above that amount.

  • Chapter III. Conditions for preliminary investigations.

- Procedures involving data subjects’ rights (Articles 15-22 GDPR) shall be settled within six months; the principle of positive administrative silence applies here.

- Procedures related to GDPR infractions shall be settled within nine months.
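To make the fine ceilings mentioned above concrete, the short Python example below uses a purely hypothetical turnover figure; under Article 83 GDPR the applicable maximum is the higher of the fixed amount and the turnover percentage.

def max_fine(turnover_eur: float, fixed_cap_eur: float, pct: float) -> float:
    """Applicable ceiling: the higher of the fixed cap and pct of turnover."""
    return max(fixed_cap_eur, turnover_eur * pct)

turnover = 900_000_000  # hypothetical total worldwide annual turnover, in EUR

# Articles 83.5 / 83.6 tier: €20,000,000 or 4% of turnover, whichever is higher
print(max_fine(turnover, 20_000_000, 0.04))  # 36000000.0

# Article 83.4 tier: €10,000,000 or 2% of turnover, whichever is higher
print(max_fine(turnover, 10_000_000, 0.02))  # 18000000.0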

Provisions that conflict with the terms of the new RDL 5/2018 are declared no longer in force (in particular, the articles of the LOPD relating to investigatory powers and the rules for imposing fines and penalties).

The Royal Decree-Law will remain in force until the new Spanish Data Protection Act is enacted, which is expected to happen at the end of 2018 or the beginning of 2019. In this regard, it is relevant to note that privacy is a constitutional right in Spain, which means that the new Spanish Data Protection Act requires special majorities in Parliament and a lengthy legislative process.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services and Data Protection Officer outsourcing.