Withdrawal of consent

Withdrawal of Consent Should be Easy

Withdrawal of consent requests can have dire consequences for your company if they are not processed immediately and seamlessly.

Withdrawal of consent should be just as easy as giving consent. So says the President of the Polish Data Protection Authority. This assessment came as a result of a review of the practices of the company ClickQuickNow.

According to the investigation, the mechanism ClickQuickNow used to process requests for withdrawal of consent did not result in a quick withdrawal. Instead, it involved a link included in the commercial information, and once the link was activated, the messages addressed to the person interested in withdrawing consent were misleading, the EDPB article reports. Additionally, the company required individuals who submitted withdrawal requests to state their reason for withdrawing consent; failure to provide a reason resulted in the discontinuation of the withdrawal process.

It was further noted that ClickQuickNow also processed, without any legal basis, the data of subjects who were not its customers and who had objected to the processing of their personal data.

These practices by ClickQuickNow were direct violations of the GDPR, particularly Articles 7(3), 12(2) and 17. It was further asserted that ClickQuickNow's practices violated the principles of lawfulness, fairness and transparency of the processing of personal data specified in the GDPR. As a result of these violations, the Polish DPA has levied an administrative fine of PLN 201,000 (approximately EUR 47,000) on ClickQuickNow. Furthermore, the Polish DPA mandated that ClickQuickNow adjust its means of processing withdrawal of consent requests and delete the data of data subjects who are not its customers and who objected to the processing of the personal data concerning them.

What mechanisms does your company have to action an individual's request for withdrawal of consent? Are your practices easy and seamless? If not, the consequences could be severe. Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

Hash function

Hash function as a personal data pseudonymisation technique

The Spanish Data Protection Authority (AEPD) has launched new guidance on the hash function as a personal data pseudonymisation technique.

The GDPR refers to pseudonymisation of personal data as one of the appropriate technical and organisational measures that may be taken by data controllers in order to ensure a level of security appropriate to the risk. However, it does not specify how data can be pseudonymised. In this context, the hash function may be a suitable technique for this purpose and, luckily for us, the AEPD has prepared some guidance in order to clarify how it works. Do you want to learn more about the hash function as a personal data pseudonymisation technique? Keep reading!

What does ‘pseudonymisation’ mean?

According to the GDPR, ‘pseudonymisation’ means “the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person”.

What is a hash function?

Have you ever tried to write a tweet with more than 280 characters and not been able to because of Twitter's character limit? The hash function has come to save you!

A digest or hash function is a process which transforms any dataset into a fixed-length series of characters, regardless of the size of the input data. For example, the full text of Romeo and Juliet may become a series of just a hundred characters after being run through a hash function. You may be wondering: how? Hash functions divide the input message into blocks, calculate a hash for each of the blocks and then combine them all.
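The fixed-length property is easy to see in practice. The short sketch below (not part of the AEPD guidance) uses Python's standard SHA-256 implementation: whether the input is five characters or five million, the digest is always 64 hexadecimal characters.

```python
import hashlib

# SHA-256 always produces a 64-character hexadecimal digest,
# whatever the size of the input.
short_digest = hashlib.sha256(b"Romeo").hexdigest()
long_digest = hashlib.sha256(b"Full text of Romeo and Juliet... " * 10000).hexdigest()

print(len(short_digest))            # 64
print(len(long_digest))             # 64
print(short_digest == long_digest)  # False: different inputs give different digests
```

The same input always yields the same digest, which is what allows a hash value to serve as a consistent pseudonym for a record.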

Hash and reidentification

How likely is the output of a hash to be reverted to the initial input? Let's imagine a processing activity which intends to associate a hash value with each National Insurance Number in the UK. The main element that would allow or hinder reidentification is the “order” in the message space.

The message space is represented by all possible datasets which may be created and from which a hash may be generated (in our case, UK National Insurance Numbers). The stricter this “order” is (for example, if only the National Insurance Numbers of women aged 30-45 were admitted), the smaller the set of numbers (the processing message space) will be. This guarantees the hash's effectiveness as a single identifier (no collisions), but it also increases the likelihood of identifying the original message from the hash.

The degree of disorder in a dataset is called entropy. The smaller the message space and the lower the entropy, the lower the risk of collision in hash processing, but the more likely re-identification becomes; and vice versa: the higher the entropy, the higher the possibility of a collision, but the lower the risk of reidentification. This is why measuring the amount of information is one of the key factors to consider whenever a message is protected via hash functions or any other pseudonymisation or encryption technique.

How does this apply to day-to-day business? It basically means that the more variables that “order” the message space (e.g. individuals' age, gender, socioeconomic status, nationality, etc.), the higher the risk of re-identification (i.e. the higher the risk of singling out an individual).

The risk of re-identification is even higher when additional information is linked to the hash.
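This re-identification risk can be made concrete. In the sketch below (a toy example, not from the AEPD guidance), identifiers follow a deliberately tiny hypothetical format of one letter and two digits, giving only 2,600 possible values; an attacker who knows the format can simply hash every candidate until one matches:

```python
import hashlib
from itertools import product
from string import ascii_uppercase, digits

def hash_id(value: str) -> str:
    # Unkeyed hash, as in the naive pseudonymisation scenario.
    return hashlib.sha256(value.encode()).hexdigest()

# The "pseudonymised" value the attacker observes. "Q42" is a made-up
# identifier in our toy one-letter-two-digit format.
target = hash_id("Q42")

recovered = None
for letter, d1, d2 in product(ascii_uppercase, digits, digits):
    candidate = f"{letter}{d1}{d2}"
    if hash_id(candidate) == target:
        recovered = candidate
        break

print(recovered)  # Q42 -- the pseudonym is reversed by exhaustive search
```

Real National Insurance Numbers have a larger space, but it is still small enough for modern hardware to enumerate in full, which is exactly the point the AEPD's "order in the message space" argument makes.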

Strategies to hinder re-identification

A strategy to hinder re-identification of the hash value is to use an encryption algorithm with a key that is stored confidentially, either by the data controller or by another party taking part in the processing, so that the message is properly encrypted before the hash is computed.

The effectiveness of the encryption will depend on the environment (distributed environments may increase this risk), the vulnerability to attacks and the volume of encrypted information (the more information, the easier it will be to carry out a cryptanalysis), among others.
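One common way to realise this keyed approach is a keyed hash (HMAC): the secret key plays the role of the confidential element, so an attacker who knows the message space can no longer recompute digests by exhaustion. This is a minimal sketch, assuming SHA-256 as the underlying hash; "QQ123456C" is a made-up identifier used only for illustration:

```python
import hmac
import hashlib
import secrets

# The controller generates and confidentially stores this key; without it,
# the digest of a given identifier cannot be recomputed.
key = secrets.token_bytes(32)

def pseudonymise(value: str, key: bytes) -> str:
    return hmac.new(key, value.encode(), hashlib.sha256).hexdigest()

tag1 = pseudonymise("QQ123456C", key)
tag2 = pseudonymise("QQ123456C", key)
tag_other = pseudonymise("QQ123456C", secrets.token_bytes(32))

print(tag1 == tag2)      # True: same key and input give the same pseudonym
print(tag1 == tag_other) # False: a different key yields an unrelated digest
```

Because the same key and input always produce the same tag, the pseudonym still works as a single consistent identifier, while the brute-force attack above fails unless the key leaks.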

As an alternative to encryption, random fields may be added to the original message, so the format of the original message is expanded to an “extended message”, which increases its entropy.
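The "extended message" idea can be sketched as appending a random field (a salt) to the original message before hashing. This is an illustrative implementation, not the AEPD's own; "QQ123456C" is again a made-up identifier:

```python
import hashlib
import secrets

def hash_extended(message: str) -> tuple[str, str]:
    """Hash an 'extended message': the original value plus a random field."""
    salt = secrets.token_hex(16)  # random field, raising the entropy of the input
    digest = hashlib.sha256((message + salt).encode()).hexdigest()
    return digest, salt

d1, s1 = hash_extended("QQ123456C")
d2, s2 = hash_extended("QQ123456C")
print(d1 == d2)  # False: the random field makes each digest unique
```

Note the trade-off: unless the random field is stored (separately and securely) so the digest can be recomputed, the hash no longer works as a single repeatable identifier, so this technique suits one-way uses better than lookups.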

However, the computation of the hash itself (e.g. the selection of a specific algorithm and its implementation), aspects related to the message space (e.g. entropy), linked information, physical safety, human factors, etc. include a series of weaknesses and introduce different risk elements that make the hash function a pseudonymisation technique rather than an anonymisation one.

According to the AEPD, using hash techniques to pseudonymise or anonymise personal data must be justified by a re-identification risk analysis associated with the specific hash technique used in the processing. “In order to consider the hash technique an anonymisation technique, this risk analysis must also assess:

The organisational measures that guarantee the removal of any information that allows for reidentification.
The reasonable guarantee of the system's robustness beyond the expected useful life of the personal data.”

Do you require assistance with pseudonymisation and anonymisation techniques? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

Automated decision making and GDPR

Automated Decision Making and the GDPR

In today’s blog we delve into automated decision making and the GDPR.

Artificial Intelligence is increasingly becoming ingrained in all facets of our societies and lives. While it certainly heralds an age of cool futuristic technology and applications (facial recognition and self-driving cars, for example!), what about when AI is utilized as an automated decision-making tool? Can this pose an issue to an individual's rights? What are the possible implications? Is it fair? Are there any legal provisions to ensure fairness?

In our latest vlog we explore some frequently asked questions as it relates to Artificial Intelligence, automated decision making and the GDPR. Click here to take a look.

A deeper look: GDPR and Automated Individual Decision making, including profiling

Automated decision-making is described by the ICO as “the process of making a decision by automated means without any human involvement”.

These decisions, the ICO says, can be based on factual data, as well as on digitally created profiles or inferred data. Examples of this include:

an online decision to award a loan; and
an aptitude test used for recruitment which uses pre-programmed algorithms and criteria.

Meanwhile Article 4(4) of the GDPR defines profiling as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.

The ICO offers the following examples of profiling:

collect and analyse personal data on a large scale, using algorithms, AI or machine-learning;
identify associations to build links between different behaviours and attributes;
create profiles that you apply to individuals; or
predict individuals’ behaviour based on their assigned profiles.

Yet while automated decision making and profiling have several benefits for both businesses and consumers, they carry risks for people’s rights and freedoms. A false or unfair decision may lead to significant adverse effects for individuals, from discrimination to undue intrusion into private life.

Article 22 of the GDPR, referenced in our vlog, seeks to address this and other risks by setting the strict parameter that “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

GDPR’s right to object fine

Failure to adhere to GDPR’s Right to Object results in EUR 200,000 Fine

A Hellenic telecommunications company has been fined EUR 200,000 for failure to remove email addresses from a direct marketing database in keeping with the GDPR's right to object.

Have you ever clicked unsubscribe from a marketing emailing list but still continued to receive emails? From experience, I'm willing to go out on a limb and say that the likelihood of this occurrence is high. While this may seem like no big deal, for companies who fail to act on requests for something as seemingly simple as removing an email address from a database, the implications can be dire. This is because it is a direct infringement of the GDPR's right to object.

In fact, just last month the Hellenic Telecommunications Organization (OTE) was fined EUR 200,000 by the Hellenic DPA for infringement of the right to object to processing for direct marketing purposes and for failure to establish adequate data protection by design in accordance with the GDPR.

According to the European Data Protection Board (EDPB), the Hellenic DPA received complaints from recipients of advertising messages from OTE concerning their inability to unsubscribe from the list of recipients of advertising messages. The EDPB article offers that, in the course of the examination of the complaints, it emerged that from 2013 onwards, due to a technical error, removal from the lists of recipients of advertising messages did not work for those recipients who used the “unsubscribe” link. OTE did not have an appropriate organisational measure in place, i.e. a defined procedure by which it could detect that the data subject's right to object could not be satisfied. OTE has since removed some 8,000 persons from its message address lists.

Direct Marketing and the GDPR

Under the GDPR's right to object, Article 21(2) and 21(3) state:

“2. Where personal data are processed for direct marketing purposes, the data subject shall have the right to object at any time to processing of personal data concerning him or her for such marketing, which includes profiling to the extent that it is related to such direct marketing.
3. Where the data subject objects to processing for direct marketing purposes, the personal data shall no longer be processed for such purposes.”

Data protection by default and ePrivacy rules

Meanwhile Article 25(2) of the GDPR provides that “The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons”. In practice, this includes ensuring that contact details are not automatically accessed and used by marketing teams.

Where businesses rely on the opt-out rules of the ePrivacy Directive, they need to be careful. “Most EU jurisdictions require an actual purchase to be made before one can rely on the opt-out rule for marketing emails,” explains Dr Bostjan Makarovic, Aphaia managing partner. “In such cases, any marketing emails may only relate to the business’s own similar goods or services, plus easy opt-out needs to be enabled both at the time of email address gathering, as well as in each email sent.”

Does your company maintain a direct marketing database? Has efficient data protection by design been established? Aphaia's data protection impact assessments and Data Protection Officer outsourcing will assist you with ensuring compliance. Contact us today.