GDPR territorial scope

The European Data Protection Board publishes guidelines on the territorial scope of the GDPR.

The European Data Protection Board (EDPB) has recently published guidelines on the territorial scope of the GDPR, in order to clarify the cases where the GDPR applies according to Article 3. The territorial scope of the GDPR is defined on the basis of two main criteria: the “establishment” criterion (1) and the “targeting” criterion (2).

  • Processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing takes place in the Union or not.

The concept of establishment extends to any real and effective activity, even a minimal one, exercised through stable arrangements. It may include activities carried out over the internet, even where there is only a single employee or agent present in the Union, provided that he or she acts with a sufficient degree of stability.

“In the context of” covers all those processing activities taking place outside the Union that are inextricably linked to the activities of a local establishment in a Member State. The “inextricable link” is therefore the criterion for determining whether the GDPR applies in the context of an establishment in the Union, but the EDPB considers that it should be analysed on a case-by-case basis, taking into account additional elements such as revenue-raising in the EU.

The EDPB underlines that a non-EU controller having a processor in the Union does not mean that the controller is processing data in the context of an establishment in the Union, because the processor merely provides a service, which does not qualify as an activity that is “inextricably linked”.

  • Processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to the offering of goods or services, irrespective of whether a payment by the data subject is required, or to the monitoring of their behaviour.

The EDPB stresses the location of the data subject in the territory of the Union as the determining factor, to be assessed at the moment when the relevant trigger activity takes place; the nationality or legal status of the data subject is not relevant in this respect. This criterion will not apply where the processing of personal data relates to an individual alone.

In addition, this criterion will only trigger the application of the GDPR where the conduct of the controller or processor clearly demonstrates an intention to offer goods or services to a data subject located in the Union. This may be ascertained from elements such as the designation by name of a Member State with reference to the good or service offered, the use of EU search engines, the features of marketing campaigns, or the existence of EU-specific addresses, telephone numbers, domain names, currencies or languages.

  • Furthermore, the GDPR will also apply to the processing of personal data by a controller not established in the Union, but in a place where Member State law applies by virtue of public international law.


Microsoft Data Controller in the Netherlands

The results of the Dutch Data Protection Impact Assessment (DPIA) show that Microsoft collects and stores large-scale personal data about the behaviour of individuals.

In the Netherlands, government organisations use Microsoft services to store data locally. It is inevitable that Microsoft collects some personal data, such as email addresses and IP addresses, since such data is necessary for individuals to use its services; however, Microsoft has also been collecting large-scale data covertly, without informing users. The individuals who use the services provided by Microsoft are given no choice over the amount of data collected, no possibility to opt out, and no ability to view the personal data collected about them.

Microsoft determines the purposes of the processing of the diagnostic data in the Office software, and the retention period of the data (from 30 days up to 18 months, or even longer if deemed necessary by Microsoft). The DPIA report shows that Microsoft processes the diagnostic data for seven purposes, as well as for any other purposes Microsoft deems to be compatible with those purposes. Microsoft therefore acts as a controller, and not as a data processor, because it determines the purposes and the means of the processing, including the retention period.

The 2017 investigation into the processing of telemetry data in the consumer and small business versions of Windows 10 (Home and Pro), conducted by the Dutch Data Protection Authority (DPA), found that Microsoft violated data protection law through breaches such as the absence of a defined purpose of processing, contrary to the purpose limitation principle, and a lack of transparency. In response to that investigation, Microsoft made some adjustments in the spring 2018 release of the software, and the Dutch DPA confirms that those adjustments will minimise the risks.

However, the report concludes that further mitigating measures must be taken to eliminate the risks completely; until that is done, the risks will persist and it is up to individuals to apply additional measures to protect their personal data and privacy.


GDPR Challenges For Artificial Intelligence

Data protection in algorithms

Technological development is enabling the automation of entire processes, much as Henry Ford’s assembly line did in 1914; the difference is that the output is now not cars but decisions that affect privacy. Since the GDPR came into force on 25th May 2018, many questions have arisen as to whether the Regulation may block data-based projects.

In this article, we aim to clarify some of the main GDPR concepts that may apply to the processing of large amounts of data and to algorithmic decision-making. It has been inspired by the report the Norwegian Data Protection Authority (Datatilsynet) published in January this year: “Artificial Intelligence and Privacy”.

Artificial intelligence and the elements it comprises, such as algorithms and machine/deep learning, are affected by the GDPR for three main reasons: the huge volume of data involved, the need for a training dataset, and the capacity for automated decision-making without human intervention. These three ideas engage four GDPR principles: fairness of processing, purpose limitation, data minimisation, and transparency. We briefly explain each of them in the following paragraphs – the first paragraph of each concept states the issue and the second describes how to address it under the GDPR.

One should also take into account that, without a lawful basis for automated decision-making (such as contract or consent), such processing cannot take place.

Fairness of processing: A discriminatory result after automated data processing can derive both from the way the training data has been classified (supervised learning) and from the characteristics of the dataset itself (unsupervised learning). In the first case, the algorithm will produce a result that corresponds to the labels used in training, so if the training data was biased, so will be the output. In the second scenario, where the training dataset comprises two categories of data with different weights and the algorithm is risk-averse, the algorithm will tend to favour the group with the higher weight.

GDPR compliance on this point would require implementing regular tests in order to monitor distortion of the dataset and minimise the risk of error, as in the sketch below.
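As an illustration, here is a minimal sketch of such a regular test in Python: it compares positive-outcome rates across groups and flags large gaps for human review. The data, field names and the 20% alert threshold are all illustrative assumptions, not something prescribed by the Datatilsynet report or the GDPR.

```python
# Minimal sketch of a recurring fairness test: compare positive-outcome
# rates between groups and flag large gaps for review.
from collections import defaultdict

def positive_rate_by_group(records, group_key, outcome_key):
    """Share of positive outcomes for each group in `records`."""
    totals, positives = defaultdict(int), defaultdict(int)
    for row in records:
        group = row[group_key]
        totals[group] += 1
        positives[group] += 1 if row[outcome_key] else 0
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(records, group_key, outcome_key):
    """Largest difference in positive-outcome rates between any two groups."""
    rates = positive_rate_by_group(records, group_key, outcome_key)
    return max(rates.values()) - min(rates.values())

if __name__ == "__main__":
    decisions = [  # illustrative automated decisions
        {"group": "A", "approved": True},
        {"group": "A", "approved": True},
        {"group": "B", "approved": True},
        {"group": "B", "approved": False},
    ]
    gap = demographic_parity_gap(decisions, "group", "approved")
    if gap > 0.2:  # assumed alert threshold; a real review would set its own
        print(f"Review needed: parity gap of {gap:.0%} between groups")
```

Run on a schedule against fresh decision logs, a check like this will not prove fairness, but it surfaces the kind of dataset distortion described above early enough to retrain or rebalance.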

Purpose limitation: In cases where previously retrieved personal data is to be re-used, the controller must consider whether the new purpose is compatible with the original one. If it is not, new consent is required or the basis for processing must be changed. This principle applies both to the internal re-use of data and to the selling of data to other companies. The only exceptions to the principle relate to scientific or historical research, or to statistical or archival purposes in the public interest. The GDPR states that scientific research should be interpreted broadly and include technological development and demonstration, basic research, and applied and privately financed research. These elements would indicate that – in some cases – the development of artificial intelligence may be considered to constitute scientific research. However, when a model develops on a continuous basis, it is difficult to differentiate between development and use, and hence to say where research stops and usage begins. It is therefore difficult to reach a conclusion on the extent to which the development and use of these models constitute scientific research.

Personal data used to train algorithms should therefore come from a dataset originally collected for that purpose, either with the consent of the parties concerned or, failing that, subject to anonymisation.

Data minimisation: The need to collect and maintain only the data that is strictly necessary, without duplication, requires pre-planning and a detailed study before the algorithm is developed, so that its purpose and usefulness are well explained and defined.

This may be achieved by making it difficult to identify individuals from the basic data held. The degree of identification is restricted by both the amount and the nature of the information used, as some details reveal more about a person than others. While the deletion of information is not feasible in this type of application because of the continuous learning, privacy by design and by default must govern any machine-learning process, applying encryption or anonymised data wherever possible. The use of pseudonymisation or encryption techniques protects the data subject’s identity and helps limit the extent of the intrusion.
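By way of example, here is a minimal sketch of keyed pseudonymisation in Python, replacing a direct identifier with an HMAC token. The key value and record fields are illustrative assumptions; in practice the key would be stored separately from the training data.

```python
# Minimal sketch: pseudonymise a direct identifier with a keyed hash (HMAC).
import hashlib
import hmac

SECRET_KEY = b"example-key"  # assumption: kept in a key vault, not with the data

def pseudonymise(identifier: str) -> str:
    """Replace an identifier with a stable keyed hash.

    The same input always maps to the same token, so records can still be
    linked for training, but the identity cannot be recovered without the key.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "clicks": 42}
record["email"] = pseudonymise(record["email"])
print(record)  # the email is now an opaque but linkable token
```

Using a keyed hash rather than a plain one matters here: a plain hash of an email address can be reversed by hashing a list of known addresses, whereas the HMAC token cannot be matched without the secret key.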

Transparency, information and the right to an explanation: All data processing should be subject to the prior provision of information to the data subjects, in addition to a number of additional guarantees for automated decision-making and profiling, such as the right to obtain human intervention on the part of the controller, to express one’s point of view, to challenge the decision and to receive an explanation of the decision taken after the assessment.

The GDPR does not specify whether the explanation should refer to the general logic on which the algorithm is built or to the specific logical path followed to reach a particular decision, but the accountability principle requires that the data subject be given a satisfactory explanation, which may include a list of the data variables, the ETL (extract, transform and load) process, or the model features.
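As a sketch of the first kind of explanation – the general logic on which the algorithm is built – a controller might expose which variables a simple model weighs and in which direction. The example below uses scikit-learn with invented feature names and data; it is an illustration, not a legally sufficient explanation on its own.

```python
# Minimal sketch: describe a simple model's "general logic" by listing its
# input variables and the direction and size of their learned weights.
from sklearn.linear_model import LogisticRegression

features = ["income_thousands", "years_at_address", "open_credit_lines"]
X = [[30, 1, 4], [55, 6, 2], [42, 3, 7], [80, 10, 1]]  # illustrative applicants
y = [0, 1, 0, 1]  # illustrative past decisions (1 = approved)

model = LogisticRegression().fit(X, y)

for name, coef in zip(features, model.coef_[0]):
    direction = "raises" if coef > 0 else "lowers"
    print(f"{name}: {direction} the approval score (weight {coef:.3f})")
```

A data subject asking why a specific decision was taken would still need the second kind of explanation – the path followed in their own case – which this summary of model features does not by itself provide.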

A data protection impact assessment, carried out with the involvement of the DPO, is required before any processing involving algorithms, artificial intelligence or profiling, in order to evaluate and address the risks to the rights and freedoms of data subjects.


Facebook privacy fine

For failing to protect users’ personal information, the Facebook privacy fine comes to £500,000, the ICO’s maximum pre-GDPR fine.

The Facebook privacy fine totals £500,000, due to serious breaches of data protection law. In July, the ICO issued a Notice of Intent to fine Facebook as part of a wide-ranging investigation into the use of data analytics for political purposes.

After considering representations from the company, the ICO issued the Facebook privacy fine and confirmed that the amount – the maximum allowable under the laws which applied at the time the incidents occurred – will remain unchanged. The full penalty notice can be read here.

How did Facebook fail to protect its users?

The ICO’s investigation found that between 2007 and 2014, Facebook processed the personal information of users unfairly by allowing application developers access to their information without sufficiently clear and informed consent, and allowing access even if users had not downloaded the app, but were simply ‘friends’ with people who had.

Facebook also failed to keep the personal information secure because it failed to make suitable checks on apps and developers using its platform. These failings meant that one developer, Dr Aleksandr Kogan, and his company GSR harvested the Facebook data of up to 87 million people worldwide without their knowledge. A subset of this data was later shared with other organisations, including SCL Group, the parent company of Cambridge Analytica, which was involved in political campaigning in the US.

Even after the misuse of the data was discovered in December 2015, Facebook did not do enough to ensure those who continued to hold it had taken adequate and timely remedial action, including deletion. In the case of SCL Group, Facebook did not suspend the company from its platform until 2018.

What does it mean for users in the UK?

The ICO found that the personal information of at least one million UK users was among the harvested data and was consequently put at risk of further misuse.

Elizabeth Denham, Information Commissioner, said: “Facebook failed to sufficiently protect the privacy of its users before, during and after the unlawful processing of this data. A company of its size and expertise should have known better and it should have done better.”

“We considered these contraventions to be so serious we imposed the maximum penalty under the previous legislation. The fine would inevitably have been significantly higher under the GDPR. One of our main motivations for taking enforcement action is to drive meaningful change in how organisations handle people’s personal data.

“Our work is continuing. There are still bigger questions to be asked and broader conversations to be had about how technology and democracy interact and whether the legal, ethical and regulatory frameworks we have in place are adequate to protect the principles on which our society is based.”

The Facebook privacy fine was served under the Data Protection Act 1998, which was replaced in May by the new Data Protection Act 2018, alongside the EU’s General Data Protection Regulation. These provide a range of new enforcement tools for the ICO, including maximum fines of £17 million or 4% of global turnover.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessment, and Data Protection Officer outsourcing.