Our blog editor Vasiliki Antoniadou explains the latest Article 29 Working Party GDPR profiling guidelines in relation to automated decision making – and how they might affect your business.
Technological evolution, and specifically the development of big data analytics, the Internet of Things (IoT) and artificial intelligence, permits the automated processing of personal data to evaluate certain personal aspects of an individual, and enables automated decision making without human involvement. These practices, namely automated decision making and profiling, are increasingly used in banking and finance, advertising, healthcare and taxation, with the obvious advantages of resource savings and high efficiency.
However, automated decision making and profiling may pose risks to the rights and freedoms of data subjects. These practices could confine individuals to suggested preferences, or even lead to social segregation and discrimination. To prevent adverse effects on individuals, the GDPR includes safeguards on solely automated decision making, including profiling.
Automated decision making, including profiling, with legal or similarly significant effects
According to the general GDPR profiling rule, solely automated decision making, including profiling, is prohibited when it produces legal or similarly significant effects. Legal effects are defined as impacts on someone’s rights, legal status or rights under a contract. For instance, an automated decision that denies or grants someone housing benefit or entry into a country is considered to have legal effects. Similarly significant effects are evaluated by their influence on an individual’s circumstances, behaviour and choices, and could potentially lead to exclusion or discrimination. E-recruiting practices without human involvement are an example of automated decisions with similarly significant effects.
Nevertheless, the following three exceptions to the general GDPR profiling prohibition apply, when the personal data processing is:

- necessary for entering into, or the performance of, a contract between the data subject and the controller;
- authorised by Union or Member State law to which the controller is subject; or
- based on the data subject’s explicit consent.
It should be noted that necessity is interpreted narrowly and requires an assessment of whether a less intrusive measure could be implemented instead. Moreover, explicit consent must be expressed by an express statement from the individual, not inferred from any other affirmative action.
Transparency obligation, human intervention and safeguards
In cases where automated decision making, including profiling, is not prohibited, the controller should inform data subjects that it conducts this sort of activity, explain the logic involved, and describe the significance and envisaged consequences of the processing.
As an additional layer of protection, even when an exception applies, data subjects are entitled to obtain human intervention, express their point of view and contest the automated decision. Notably, the human reviewer should carry out a thorough assessment and have the authority to overturn the automated decision.
Since the risk of errors and bias exists, controllers should adopt effective procedures to regularly check the accuracy of their systems and to minimise errors and discrimination.
GDPR profiling and the Data Protection Impact Assessment (DPIA)
A data protection impact assessment is particularly useful when a controller undertakes automated decision making and profiling. A DPIA enables the controller to assess the risks and identify the suitable measures and procedures that should be put in place in order to comply with the regulation’s provisions.
It is pertinent to mention that the general GDPR requirement of fair and lawful processing, the controller’s obligation to provide individuals with a privacy notice, and individuals’ rights to objection, rectification and erasure apply in this context too.