London pharmacy fined for failing to ensure the security of special category personal data

The pharmacy left 500,000 documents in unlocked containers at the back of its premises.

Failing to ensure the security of special category personal data may trigger large fines under the GDPR.

Are your devices password-protected? Do you make sure you only use cloud services that encrypt data? If your answer to these questions is ‘yes’, one could say that you have the cybersecurity essentials covered, but wait: this does not necessarily mean that the data you process is safe. What about physical security? It is just as important as cybersecurity, yet businesses tend to pay it less attention because it belongs to the ‘analogue world’. It nevertheless involves several risks that have already triggered fines since the GDPR started to apply, especially when it comes to failing to ensure the security of special category personal data.
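
On the cybersecurity side of that checklist, the encryption point is easy to illustrate. Below is a minimal sketch, assuming Python’s widely used cryptography package; the record and the key handling are hypothetical simplifications, and in production the key would come from a key management service rather than being generated next to the data.

```python
# Minimal sketch: encrypting personal data client-side before it is handed to a
# cloud service. Assumes the 'cryptography' package (pip install cryptography);
# the record below is a hypothetical example of special category data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice: fetched from a key management service
fernet = Fernet(key)

record = b"Jane Doe, 01/01/1980, NHS no. 000 000 0000, prescription: ..."
token = fernet.encrypt(record)          # only the ciphertext leaves your systems

assert fernet.decrypt(token) == record  # only a key holder can recover the data
```

As the case below shows, however, none of this helps when the paper copies sit in an unlocked container behind the premises.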

The latest fine in this regard was imposed by the ICO on Doorstep Dispensaree Ltd, a London-based pharmacy, on 20 December. The company will have to pay £275,000 for failing to ensure the security of special category personal data.

What happened?

The pharmacy left approximately 500,000 documents, dated between June 2016 and June 2018, in unlocked containers at the back of its premises. The documents included names, addresses, dates of birth, NHS numbers, medical information and prescriptions belonging to an unknown number of people.

Unfortunately, similar breaches take place more often than one might expect. A school was fined by the Spanish DPA (AEPD) for a comparable failure last year: employees of its cleaning service threw documents, including students’ exams containing personal data, into a container without taking appropriate measures to destroy them.

Why is this a breach of the GDPR?

Leaving any type of personal data in unlocked containers is a breach of the GDPR, as anyone could access the information without any legitimate basis. Furthermore, in this case the documents contained health information, which is special category personal data requiring additional protection.

Failing to ensure the security of special category personal data is therefore a serious breach of the GDPR.

What does the GDPR say?

Article 32.1 GDPR states that “Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk”. In this case, the technical and organisational measures were evidently missing, or at least were not appropriate to ensure the level of security corresponding to the risk.

Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing. Contact us today.

What does the new Schrems II case mean for businesses?

The CJEU’s Advocate General Henrik Saugmandsgaard Øe publishes his opinion in the so-called ‘Schrems II’ case.

New Year, new regulation concerns? Two weeks before the end of 2019, the Court of Justice of the European Union’s (CJEU) Advocate General delivered his opinion in the case known as ‘Schrems II’, concerning the validity of the Standard Contractual Clauses (SCCs).

Article 46 GDPR refers to SCCs as a valid safeguard that businesses can incorporate into contracts in order to transfer personal data to third countries. SCCs impose contractual obligations on the data exporter and the data importer, and grant rights to the individuals whose personal data is transferred. Individuals can enforce those rights directly against both the data importer and the data exporter.

Let’s recap first.

What is the ‘Schrems II’ case about?

‘Schrems II’ is a sequel to the complaint made in 2013 by Max Schrems in connection with Facebook’s transfers of personal data to the U.S. The complaint was brought before the Irish DPA and was referred to the CJEU, which declared Safe Harbour invalid.

As a consequence, businesses could no longer rely on Safe Harbour for international data transfers and started to base them on SCCs instead. In 2016, the EU Commission replaced Safe Harbour with ‘Privacy Shield’ in light of the ‘Schrems I’ case.

In ‘Schrems II’, Max Schrems lodged a new complaint with the Irish DPA, challenging SCCs on grounds similar to those raised against Safe Harbour. The Advocate General has now issued his opinion.

What is the Advocate General’s opinion in Schrems II?

The CJEU’s Advocate General has reaffirmed the validity of SCCs. However, he has suggested that businesses and data protection authorities (DPAs) should assess the sufficiency of foreign countries’ national security protections on a case-by-case basis.

The opinion states that: “a supervisory authority must examine with all due diligence the complaint lodged by a person whose data are alleged to be transferred to a third country in breach of the standard contractual clauses applicable to the transfer” and “where appropriate, it must suspend the transfer if it concludes that the standard contractual clauses are not being complied with and that appropriate protection of the data transferred cannot be ensured by other means.”

Why is the Advocate General’s opinion in Schrems II not surprising?

The above suggests that transferring personal data to third countries will require more effort than merely adding SCCs to the agreement with the importer: businesses will need to ensure that the SCCs are complied with in practice. In my view, though, this burden on controllers is nothing new; it can be derived from the controllers’ general responsibility to demonstrate compliance with the GDPR principles, namely lawfulness, fairness, transparency, data minimisation, purpose limitation, accuracy, storage limitation, integrity and confidentiality.

These principles apply to any international transfer of personal data, regardless of the transfer safeguard used. Whereas the Irish DPA highlights that this could result in fragmentation amongst supervisory authorities within the Member States, such fragmentation may be unavoidable in the practical application of the GDPR in the absence of common opinions from the European Data Protection Board (EDPB) or case law.

Although the CJEU is not bound by the opinion, the Court follows the Advocate General’s views in the majority of cases. The CJEU is expected to issue a final decision in the coming months.

Do you make international data transfers to third countries? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing. Contact us today.

ICT regulation 2020

ICT Regulation in 2020: What to expect? An Aphaia Perspective

Aphaia’s Managing Partner Bostjan Makarovic and Partner Cristina Contero Almagro weigh in on ICT regulation in 2019 and offer their predictions and hopes for 2020.

To say that 2019 was an eventful year for data protection, ICT governance and ePrivacy, specifically within the EU and the United Kingdom, would be an understatement. Indeed, as the first full year with the GDPR, 2019 proved to be a year of lessons, policy implementations, new developments, court rulings and fines, all centred on honouring the privacy and rights of individuals in today’s highly technical, online-based era. In fact, Privacy Affairs reports a total of 150 fines, totalling €103,852,871 for the year, with the €50 million sanction on Google being the largest fine of the year.

So, with 2019 winding down to give way to 2020, we sat down with Aphaia’s Managing Partner Bostjan Makarovic and Aphaia Partner Cristina Contero Almagro for their professional insights on the year past and their expectations and projections for 2020.

From a data protection and AI ethics standpoint, how would you describe 2019? What would you pinpoint as two of the most impactful occurrences with regard to ICT regulation in the year just past?

Bostjan: 2019 has been the year when the topic of AI seems to have found a special place in the EU’s regulatory landscape. In addition, important new practical questions at the intersection of privacy and AI regulation have emerged, for example in relation to smart billboards.

Cristina: From an AI ethics standpoint, I would say 2019 has been a turning point. On 8 April 2019, the High-Level Expert Group on AI (AI-HLEG) presented its Ethics Guidelines for Trustworthy Artificial Intelligence, part of a series of four documents. In April we also became members of the European AI Alliance, a multi-stakeholder forum for engaging in a broad and open discussion of all aspects of AI development and its impact on the economy and society, which allows us to interact with the AI-HLEG. The first AI Assembly took place on 26th June in Brussels, and we were invited to attend, so we did. The Policy and Investment Recommendations on AI and the piloting process of the AI Ethics Guidelines were launched at this event. This year has also been the year of our YouTube channel, and we hope to keep working on our vlogs during 2020.

From a data protection standpoint: 2019 has been the first whole year with the GDPR, as it started to apply in May 2018. We have been able to learn from the fines and guidelines issued both by Member State DPAs and by EU bodies such as the EDPB. One of the most anticipated events of the year was the publication of cookies guidance by several DPAs (the ICO in the UK, the AEPD in Spain, the CNIL in France, etc.), although we will still have to wait for the new ePrivacy Regulation.

As we look ahead to 2020, what are some expectations from your analysis? Do you foresee any changes or implementations that would have a big effect on the way businesses operate?

Cristina: I personally hope that the EU Guidelines raise awareness of the importance of ethics, and that this leads to the approval of codes of conduct for the industry. We also expect a revised ePrivacy Regulation proposal under the forthcoming Croatian Presidency of the EU.

It would also be great to see 2020 become the year of 5G, as it will definitely impact the way we do business, and our lives as such; it is also closely linked to data protection and AI ethics. There is a lot of work to do there. It is challenging, and we are looking forward to it becoming a reality. Smart cities, self-driving cars, AR… there is a whole world out there waiting for 5G!

And we cannot forget about Brexit, which may severely impact data protection and AI ethics across Europe.

Bostjan: In the second half of 2020, the new European Electronic Communications Code (EECC) will directly affect both communications service providers and telecoms infrastructure providers across the EU. I am also wondering whether in 2020 the European Commission might seriously start looking into the possibility of a mandatory regulatory framework for AI, in addition to the GDPR.

What advice would you give to online businesses and companies utilizing AI to ensure they get on top of the changes coming in 2020?

Cristina: Without a doubt: they should contact Aphaia! (Just kidding.) What I would advise is that they look at the past and listen to their customers. Look at the past because, taking the GDPR as an example, it is easy to see how costly it is not to do the right thing from the beginning; and listen to their customers because the audience is demanding trustworthy AI. They may not see a negative impact from failing to provide it for now, but it is only a matter of time: ‘adapt or die’.

Bostjan: As Cristina pointed out, getting timely compliance advice is crucial. The GDPR requirement of ‘data protection by design and by default’ already obliges businesses to look into privacy matters at the point of development of a product, not once it has been finalised or even launched. In the second half of 2020, many online businesses providing voice, chat or messaging platforms will also need to ensure they comply with the EECC.

Do you need assistance in ICT policy or regulation? Aphaia provides GDPR and UK Data Protection Act 2018 consultancy services, data protection impact assessments, Data Protection Officer outsourcing, AI ethics assessments, and telecoms policy and regulation consultancy services.

EDPB Guidelines Right to be Forgotten

EDPB guidelines on the criteria of the Right to be Forgotten in the search engine cases under the GDPR

The right to be forgotten is regulated in Article 17 GDPR, which grants individuals the right to request, on certain grounds, erasure of their personal data.

“Right to be Forgotten”. A famous phrase over the last few years, right? The case known as Google Spain v. Costeja Gonzalez is the origin of this concept. Let’s start with an overview of what happened. In 2010, a Spanish citizen asked a Spanish newspaper and Google (both Google Inc. and Google Spain) to remove data concerning a confiscation order on his house, which had been fully resolved several years earlier but was still displayed and available when his name was entered into the Google search engine. The AEPD upheld the complaint against Google; Google appealed the AEPD’s decision, and the National High Court of Spain referred several questions to the Court of Justice of the European Union (CJEU) for a preliminary ruling. The CJEU ruled that “a data subject may request the provider of an online search engine (‘search engine provider’) to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful”.

The Right to be Forgotten is regulated in Article 17 GDPR (“right to erasure”). The EDPB guidelines aim to interpret the Right to be Forgotten in the search engine cases in light of the provisions of Article 17.1 GDPR, together with Article 21 GDPR, which grants the right to object and can serve as a legal basis for delisting requests.

When it comes to Article 17.2 GDPR, the EDPB simply points out that the statement by the Article 29 Working Party, saying that search engine providers “should not as a general practice inform the webmasters of the pages affected by de-listing of the fact that some webpages cannot be acceded from the search engine in response to specific queries” because “such communication has no legal basis under EU data protection law”, remains valid.

The grounds of the right to request delisting under the GDPR

Article 17.1 sets out a general obligation to erase the data in the following six cases (a short illustrative sketch of this triage follows after point f):

a. the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed (Article 17.1.a).

Within the context of the right to request delisting, the key is the balance between the protection of privacy and the interest of Internet users in accessing the information. The EDPB provides the following examples of scenarios where a data subject may exercise his or her right to request delisting pursuant to Article 17.1.a:

information about a data subject held by a company has been removed from the public register;
a link to a firm’s website contains the contact details of a data subject who is no longer working at that firm;
information had to be published on the internet for a number of years to meet a legal obligation and remained online longer than the time limit specified by the legislation.

b. the data subject withdraws consent on which the processing is based (Article 17.1.b).

The CJEU has interpreted that this clause is unlikely to apply when it comes to a delisting request because the controller to whom the data subject gave his or her consent is the web publisher, not the search engine operator that indexes the data. However, whenever a data subject withdraws his or her consent for the use of his or her data on a particular web page, the original publisher of that web page should inform search engine providers.

c. the data subject exercised his or her right to object to the processing of his or her personal data pursuant to Articles 21.1 and 21.2 GDPR (Article 17.1.c).

Unlike the former Data Protection Directive, the GDPR does not impose on the data subject an obligation to demonstrate “compelling legitimate grounds” in order to object to processing “on grounds relating to his or her particular situation”. As a result, when a search engine provider receives a request to delist based on the data subject’s particular situation, it must now erase the personal data unless it can demonstrate “overriding legitimate grounds” for listing the specific search result. There is therefore a need to assess the particular situation of the data subject together with the classic criteria, such as his or her role in public life, how his or her privacy is affected, the nature of the information, whether the information has been verified, how dated the facts are, etc.

d. the personal data have been unlawfully processed (Article 17.1.d).

The notion of unlawful processing shall be interpreted in view of Article 6 GDPR, but also broadly, as the infringement of a legal provision other than the GDPR.

e. the erasure is required for compliance with a legal obligation (Article 17.1.e).

Compliance with a legal obligation may result from an injunction, from an express requirement under national or EU law placing the controller under a “legal obligation to erase”, or from the mere breach by the data controller of the retention period.

f. the personal data have been collected in relation to the offer of information society services to a minor (Article 17.1.f which refers to Article 8.1).

According to Directive 2000/31/EC, information society services are not restricted to “services giving rise to on-line contracting but also, in so far as they represent an economic activity, extend to services which are not remunerated by those who receive them, such as those offering on-line information or commercial communications, or those providing tools allowing for search, access and retrieval of data”.
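
To make the structure of these six grounds easier to scan, here is a short, purely illustrative Python sketch of how a delisting request might be triaged against Article 17.1. All names are hypothetical, and the real assessment, notably the balancing test under Article 17.1.c and Article 21, calls for human legal judgment rather than a boolean flag.

```python
# Purely illustrative triage of a delisting request against the Article 17.1
# grounds discussed above. All names are hypothetical; a real assessment,
# especially the balancing test under 17.1.c / Article 21, needs legal review.
from dataclasses import dataclass

@dataclass
class DelistingRequest:
    data_no_longer_necessary: bool   # 17.1.a
    consent_withdrawn: bool          # 17.1.b (rarely applicable to search engines)
    objection_upheld: bool           # 17.1.c, after the Article 21 balancing test
    unlawful_processing: bool        # 17.1.d
    legal_obligation_to_erase: bool  # 17.1.e
    collected_from_minor: bool       # 17.1.f

def applicable_grounds(req: DelistingRequest) -> list[str]:
    """Return the Article 17.1 grounds that appear to apply to the request."""
    checks = {
        "17.1.a no longer necessary": req.data_no_longer_necessary,
        "17.1.b consent withdrawn": req.consent_withdrawn,
        "17.1.c objection upheld": req.objection_upheld,
        "17.1.d unlawful processing": req.unlawful_processing,
        "17.1.e legal obligation": req.legal_obligation_to_erase,
        "17.1.f minor's data": req.collected_from_minor,
    }
    return [ground for ground, met in checks.items() if met]

# Example: data published to meet a legal obligation that stayed online past the
# statutory time limit (see the 17.1.a examples above).
request = DelistingRequest(True, False, False, False, True, False)
print(applicable_grounds(request))  # ['17.1.a no longer necessary', '17.1.e legal obligation']
```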

The exceptions to the right to request delisting under Article 17.3 GDPR

Article 17.3 GDPR states that Articles 17.1 and 17.2 will not apply where processing is necessary for certain purposes. However, according to the EDPB, those exceptions under Article 17.3 GDPR do not appear suitable in the case of a delisting request, and such inadequacy pleads in favour of the application of Article 21 GDPR. Let’s see why:

a. for exercising the right of freedom of expression and information (Article 17.3.a).

The CJEU recognised in the Costeja judgment, and repeated recently in the Google 2 judgment, that the processing carried out by a search engine provider can significantly affect the fundamental rights to privacy and data protection when the search is performed using the name of a data subject. The Court also considered that the rights of the data subject will, in general, prevail over the interest of Internet users in accessing information through the search engine provider. However, it identified several factors that may influence this determination, for example the nature of the information or its sensitivity, and especially the interest of Internet users in accessing the information, an interest that can vary depending on the role played by the data subject in public life. This means that, depending on the circumstances of the case, search engine providers may refuse to delist content, but they must be able to demonstrate that its inclusion in the list of results is strictly necessary for protecting the freedom of information of Internet users.

b. for compliance with a legal obligation that requires processing by Union or Member State law to which the controller is subject or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (Article 17.3.b).

In the view of the EDPB, the content of this exemption makes it difficult to apply to the activity of search engine providers, as their processing of data is based, in principle, on the legitimate interest of the search engine provider.

c. for reasons of public interest in the area of public health in accordance with points (h) and (i) of Article 9 (2) as well as Article 9 (3) (Article 17.3.c).

Considering the activity that search engine providers carry out, which implies that they do not produce or present information themselves, it is difficult to imagine legal provisions that oblige search engine providers to disseminate certain information, rather than imposing the obligation for that publication on other web pages that will then be linked by search engine providers.

On another note, search engine providers are not public authorities and therefore do not exercise public powers by themselves. However, they could exercise such powers if these were attributed to them by the law of a Member State or of the Union, in the same way that they could carry out missions of public interest if their activity were considered necessary to satisfy that public interest in accordance with national legislation. Nonetheless, given the characteristics of search engine providers, it is unlikely that Member States will grant them public powers or consider their activity, or part of it, necessary for the achievement of a legally established public interest.

d. for archiving purposes in the public interest, scientific or historical research purposes, or statistical purposes in accordance with Article 89 (1) in so far as the right referred to in paragraph 1 is likely to render impossible or seriously impair the achievement of the objectives of that processing (Article 17.3.d).

These purposes must be objectively pursued by the search engine provider. The possibility that the suppression of results could significantly affect research or statistical purposes pursued by users of the search engine provider’s service is not relevant for the application of this exemption. It should also be noted that these purposes may be objectively pursued by the search engine provider without a link between the name of the data subject and the search results being necessary.

e. for the establishment, exercise or defence of legal claims (Article 17.3.e).

A delisting request entails the suppression of certain results from the search results page that the search engine provider displays when the name of a data subject is used as the search term. The information remains accessible using other search terms.

It is important to stress that, even though the above is focused on processing by search engine providers and on delisting requests submitted by data subjects, Article 17 and Article 21 GDPR are applicable to all data controllers.

Have you received a right to erasure request? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing. Contact us today.