H&M fined by HmbBfDI

H&M fined by HmbBfDI: over 35 million euros for data protection breaches.

H&M has been fined by the Hamburg Commissioner for Data Protection and Freedom of Information (HmbBfDI). H&M (Hennes & Mauritz), the popular clothing company, registered in Hamburg with a service center in Nuremberg and stores across Europe and North America, has become the center of a data protection controversy. This has cost the brand a fine of over 35 million euros, as reported by the EDPB.

H&M interviewed their workforce about their personal lives, recording and storing excessive amounts of personal data.

The H&M company had been operating this way for more than six years at its service center in Nuremberg. Supervisors interviewed employees extensively about their personal lives, recording everything and storing the information on the company's internal network. In particular, following absences such as vacations and sick leave, even short ones, they would conduct long conversations called "Welcome Back Talks". In those meetings, they would probe every detail of the employees' activities during the absence. The supervisors recorded extensive data, including not only vacation experiences but also symptoms of illness and diagnoses.

In addition to what was collected during those Welcome Back Talks, supervisors also recorded details gleaned from casual hallway conversations, ranging from family issues to personal, political and religious beliefs. Some of this information was used to evaluate employees' development within the workplace as well as their performance.

This practice, which put the employees' privacy at great risk, came to light when the data became accessible company-wide for several hours in October 2019 due to a configuration error.

In October 2019, these documents, containing personal information on individual employees, became accessible company-wide for several hours due to a configuration error in the company's network. This event directly violated the employees' rights by putting their personal and private information at risk. The Hamburg Commissioner for Data Protection and Freedom of Information (HmbBfDI), upon becoming aware of the data breach through press reports, demanded that the contents of the network drive be frozen and subsequently handed over, and interviewed witnesses to confirm the company's practices. H&M submitted around 60 gigabytes of data for evaluation.

Following the hefty fine, H&M has taken full responsibility for the incident, apologized and is taking corrective measures.

The company was issued a fine of 35,258,707.95 euros for the violation. Prof. Dr. Johannes Caspar, Hamburg's Commissioner for Data Protection and Freedom of Information, comments: "This case documents a serious disregard for employee data protection at the H&M site in Nuremberg. The amount of the fine imposed is therefore adequate and effective to deter companies from violating the privacy of their employees." This should also serve as an example for other companies of how to operate and safeguard their employees' private information if they wish to avoid similar situations.

The company presented HmbBfDI with a comprehensive concept of how data protection is to be implemented at the Nuremberg site from now on. Management has also expressly apologized to those affected, and offered employees considerable compensation for the breach. The newly introduced data protection concept includes a newly appointed data protection coordinator, monthly data protection status updates, increasingly communicated whistleblower protection and a consistent concept for dealing with data subjects’ rights of access.

“Data processing should be always subject to the existence of at least one lawful basis of those laid down in Article 6 GDPR. Records on religious beliefs and diagnoses merit even higher protection because they are special categories of data with restricted processing. This fine should serve as an example for other companies and it shows that no personal data processing is exempt from complying with the data protection regulation, including those operations that are limited to the internal networks” comments Cristina Contero Almagro, Partner in Aphaia.

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling employee data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Brain Implants, GDPR and AI

Brain implants may be the next challenge for AI and the GDPR. Our guest blogger Lahiru Bulathwela, an LLM student in innovation and technology at the University of Edinburgh, explores why and how.

What do brain implants, the GDPR and AI share? Elon Musk recently demonstrated his brain-hacking implant on Gertrude the pig, sparking growing interest in mainstream consumer applications. Neural implants aren't a recent development; they have been utilised by researchers and clinicians for several years. Successful treatments with neural implants have helped patients who suffer from clinical depression and Parkinson's, and have restored limited movement to paralysed individuals. The excitement around their development through companies like Neuralink is palpable: the potential for neural implants to treat individuals, and in the future enhance them, is certainly an interesting prospect. However, as with any innovative technology, the excitement of its development often overshadows concerns about its potential. For every person, the brain represents the total sum of their individuality and identity, so concerns surrounding neural implants are particularly sensitive.

Many potential obstacles face the development of neural implants, ranging from technological to physiological limitations. This blog will explore the issues that relate to data protection: data protection is central to our information-dominated society, and neural implants pose new challenges because the information they use is arguably the most sensitive of all data.

How do they work?

In the simplest terms, a neural implant is an invasive or non-invasive device that can interact with an individual's brain. Certain devices can map an individual's native neural activity and some devices can stimulate the neurons in a brain to alter functioning. While the technology is advanced, there are limitations to its efficacy, primarily surrounding our knowledge of native neural circuits. Currently, we can map an individual's brain and record its neural activity, but we lack the knowledge to interpret that information in the meaningful way that would be expected of a consumer device. While limited at the moment, it is a question of when, rather than if, we will increase our understanding of native neural networks.

GDPR and Neural implants

The GDPR is the most recent iteration of data protection regulation in the EU, and it sets a high regulatory standard. It is the most progressive data protection regulation to date, but, like other legislative tools, its development and implementation are a reaction to the rapidly evolving use of information in society. Although the GDPR does not mention 'neural information' specifically within the definition of personal data in Art. 4(1), a person may be identified from such information, and therefore it is personal data:

“…’personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;”

Factors such as physical, physiological or mental identity could potentially be attributed to neural information, so any measured neural condition is personal data. Information gathered from a person's brain is the most sensitive of personal information, especially when you consider whether it is even possible to consent to its collection.

When we consent to our information being collected or shared, we have a degree of control over what information is made available. A data controller must justify the lawful collection of that data (Art. 6 and Art. 9) and ensure that consent is freely given, and not made conditional on the provision of a service or performance of a contract where the data is not necessary for the performance of that contract (Art. 7(4)).

An individual can consent to share personal medical history with an insurer without having to share information on their shopping habits or their relationships. We can't, however, limit the information within our brain, and neural activity cannot be hidden from a device that records electrical stimulation in the brain. The lack of control over what information our brain shares is a significant issue, even though we are currently limited in our ability to interpret such information.

Future regulatory measures?

The GDPR is the most progressive regulatory instrument implemented anywhere in the world, yet it lacks the necessary depth to deal with the ever-changing landscape of information gathering. The next iteration of data protection regulation requires foresight to protect individuals effectively from misappropriation of their information: foresight to understand that overzealous regulation can hamper the progress of innovation, but that ineffective regulation could undermine consumer confidence in neural implants.

When Elon Musk described his Neuralink implant as a "Fitbit in your skull", he minimised how invasive neural implants would be to our privacy. While it could be said that individuals are increasingly comfortable sharing their information in public fora such as social media, there is still a choice about what information to share. The lack of choice over what information a neural implant collects calls for robust regulation; I predict that such regulation will need to include a requirement to use technology to enforce it.

Regulation through technology?

Governance using technology is a potential alternative to standard legislative tools. We have seen technologies such as blockchain use cryptography to decentralise record-keeping, ensuring that information held on record is only accessible to verified individuals and that no single person or entity can access or control all the information.
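The tamper-evidence property described above can be illustrated with a minimal hash chain, a deliberately simplified sketch of the cryptographic idea behind blockchain record-keeping (it is not any specific blockchain implementation): each record commits to the hash of the previous one, so altering any earlier record invalidates every later link.

```python
import hashlib
import json

def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the hash of the previous link."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list, record: dict) -> None:
    """Add a record, committing it to the current end of the chain."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "hash": record_hash(record, prev_hash)})

def verify(chain: list) -> bool:
    """Recompute every link; any altered record breaks the chain."""
    prev_hash = "0" * 64
    for link in chain:
        if record_hash(link["record"], prev_hash) != link["hash"]:
            return False
        prev_hash = link["hash"]
    return True

chain = []
append(chain, {"event": "access granted", "user": "alice"})
append(chain, {"event": "record updated", "user": "bob"})
assert verify(chain)

chain[0]["record"]["user"] = "mallory"  # tamper with an earlier entry
assert not verify(chain)
```

In a real decentralised system many parties would each hold a copy of the chain, so a tampered copy is detectable by comparison; the sketch shows only the single-chain integrity check.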

More specific to neural implants would be the use of machine learning to protect an individual's privacy. Machine learning is already being utilised in clinical environments to assist with brain mapping, and it may in future allow us to better understand our native neural networks. Using machine learning to regulate what information is shared by a neural implant may seem far-fetched at this moment in time, but I would contend that this is due to a lack of understanding of our brains rather than an algorithmic limitation. More research is required to understand what precisely can be interpreted from our neural activity, but we have the technological capability to create algorithms that could learn what information should and should not be shared.
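As a purely illustrative sketch of that idea, the snippet below places a filtering layer between signal capture and disclosure. The channel names, categories and sensitivity rules are entirely hypothetical, and a real system would presumably rely on a trained model rather than a hand-written mapping; the point is only that software can decide which recorded features ever leave the device.

```python
# Hypothetical sketch: a consent-aware filter between capture and sharing.
# Channel names and categories are invented for illustration only.

SENSITIVE_CATEGORIES = {"mood", "health"}  # categories the user has not consented to share

CHANNEL_CATEGORY = {
    "motor_intent": "movement",
    "affect_index": "mood",
    "seizure_marker": "health",
    "focus_level": "cognition",
}

def filter_for_sharing(features: dict) -> dict:
    """Return only features whose category is recognised and non-sensitive."""
    shared = {}
    for name, value in features.items():
        category = CHANNEL_CATEGORY.get(name)
        # Unrecognised channels and sensitive categories are never shared.
        if category is not None and category not in SENSITIVE_CATEGORIES:
            shared[name] = value
    return shared

captured = {"motor_intent": 0.82, "affect_index": 0.44,
            "seizure_marker": 0.03, "focus_level": 0.61}
shared = filter_for_sharing(captured)
assert set(shared) == {"motor_intent", "focus_level"}
```

A learned version of this layer would replace the static mapping with a classifier trained to recognise sensitive signal patterns, but the default-deny stance for unrecognised channels would be the key privacy design choice either way.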

Neural implants at present lack the sophistication to create any significant problems for an individual's privacy, but they offer an opportunity for legislators and technologists to create proactive measures that would effectively protect individuals and build consumer confidence.

Check out our vlog exploring Brain Implants, GDPR and AI:

You can learn more about AI ethics and regulation on our YouTube channel.

Aphaia provides GDPR, Data Protection Act 2018 and ePrivacy adaptation consultancy services, including data protection impact assessments, CCPA compliance and Data Protection Officer outsourcing.

EDPB Guidelines on the targeting of social media users overview

On 2nd September, the EDPB adopted their Guidelines 8/2020 on the targeting of social media users, which aim to clarify the implications that these practices may have on privacy and data protection.

Most social media platforms allow their users to manage their privacy preferences by enabling the option to make their profiles public or private. Pictures, videos and text are not the only personal information processed in this context though: what about analytics used to target social media users? Analytics are also personal data and they should be managed and protected accordingly. The European Data Protection Board (EDPB) is aware of the risks this creates to the fundamental rights and freedoms of individuals and has published these guidelines to provide their recommendations with regard to the roles and responsibilities of targeters and social media providers.

Actors involved in social media targeting

The EDPB explains the concepts of social media providers, users and targeters as follows:

  • Social media providers should be understood as providers of online platforms that enable the development of networks and communities of users, among which information and content is shared.
  • Users are the individuals who are registered with the service and create accounts and profiles, whose data is used for targeting purposes. This term also comprises those individuals who access the services without having registered.
  • Targeters are defined as natural or legal persons that communicate specific messages to the users of social media in order to advance commercial, political, or other interests, on the basis of specific parameters or criteria.
  • Other actors who may be also relevant are marketing service providers, ad networks, ad exchanges, demand-side and supply-side platforms, data brokers, data management providers (DMPs) and data analytics companies.

Identifying the roles and responsibilities of the various actors correctly is key in the process, as the interaction between social media providers and other actors may give rise to joint responsibilities under the GDPR.

Risks to the rights and freedoms of users

The EDPB highlights some of the main risks that may be derived from social media targeting:

  • Uses of personal data that go against or beyond individuals’ reasonable expectations.
  • Combination of personal data from different sources.
  • Existence of profiling activities connected to targeting.
  • Obstacles to the individual’s ability to exercise control over his or her personal data.
  • Lack of transparency regarding the role of the different actors and the processing operations.
  • Possibility of discrimination and exclusion.
  • Possible manipulation of users and undue influence over them.
  • Political and ideological polarisation.
  • Information overload.
  • Manipulation over children’s autonomy and their right to development.
  • Concentration in the markets of social media and targeting.

Relevant case law

The EDPB analyses the respective roles and responsibilities of social media providers and targeters through the relevant case law of the CJEU, namely the judgments in Wirtschaftsakademie (C-210/16) and Fashion ID (C-40/17):

– In Wirtschaftsakademie, the CJEU decided that the administrator of a so-called “fan page” on Facebook must be regarded as taking part in the determination of the purposes and means of the processing of personal data. The reasoning behind this decision is that the creation of a fan page involves the definition of parameters by the administrator, which has an influence on the processing of personal data for the purpose of producing statistics based on visits to the fan page, using the filters provided by Facebook.

– In Fashion ID, the CJEU decided that a website operator can be considered a controller when it embeds a Facebook social plugin on its website that causes the browser of a visitor to transmit personal data of the visitor to Facebook. However, the liability of the website operator will be "limited to the operation or set of operations involving the processing of personal data in respect of which it actually determines the purposes and means", therefore the website operator will not be a controller for subsequent operations carried out by Facebook after the data has been transmitted.

Roles and responsibilities of targeters and social media providers

Social media users may be targeted on the basis of provided, observed or inferred data, as well as a combination thereof.

In most cases both the targeter and the social media provider will participate in determining the purpose (e.g. displaying a specific advertisement to the set of individual social media users who make up the target audience) and the means (e.g. the targeter by choosing to use the services offered by the social media provider and requesting it to target an audience based on certain criteria, and the social media provider by deciding which categories of data shall be processed, which targeting criteria shall be offered and who shall have access) of the processing of personal data. They will therefore be deemed joint controllers pursuant to Article 26 GDPR.

As pointed out by the CJEU in Fashion ID, the joint controllership status will only extend to those processing operations for which the targeter and the social media provider effectively co-determine the purposes and means, such as the processing of personal data resulting from the selection of the relevant targeting criteria, the display of the advertisement to the target audience and the processing of personal data undertaken by the social media provider to report to the targeter about the results of the targeting campaign. However, the joint control does not extend to operations involving the processing of personal data at other stages occurring before the selection of the relevant targeting criteria or after the targeting and reporting has been completed.

The EDPB also recalls that actual access to personal data is not a prerequisite for joint responsibility, thus the above analysis would remain the same even if the targeter only specified the parameters of its intended audience and did not have access to the personal data of the affected users.

Legal bases of the processing

It is important to note that, as joint controllers, both the social media provider and the targeter must be able to demonstrate the existence of a legal basis pursuant to Article 6 GDPR to justify the processing of personal data for which each of the joint controllers is responsible.

In general terms, the two legal bases that are most likely to apply are legitimate interest and the data subject's consent.

In order to rely on legitimate interest as the lawful basis, there are three cumulative conditions that should be met:

– (i) the pursuit of a legitimate interest by the data controller or by the third party or parties to whom the data are disclosed;
– (ii) the need to process personal data for the purposes of the legitimate interests pursued, and
– (iii) the condition that the fundamental rights and freedoms of the data subject whose data require protection do not take precedence.

In addition, opt-out should be enabled in such a way that data subjects are not only provided with the possibility to object to the display of targeted advertising when accessing the platform, but are also provided with controls ensuring that the underlying processing of their personal data for targeting purposes no longer takes place after they have objected.

Legitimate interest will not be suitable in some circumstances, and consent will be required in those cases. Examples include intrusive profiling and tracking practices for marketing or advertising purposes that involve tracking individuals across multiple websites, locations, devices or services, or data brokering.

The EDPB further notes that the consent collected for the implementation of tracking technologies needs to fulfil the conditions laid out in Article 7 GDPR in order to be valid. They highlight that pre-ticked check-boxes by the service provider which the user must then deselect to refuse his or her consent do not constitute valid consent. Moreover, based on recital 32, actions such as scrolling or swiping through a webpage or similar user activity would not under any circumstances satisfy the requirement of a clear and affirmative action, because such actions may be difficult to distinguish from other activity or interaction by a user, which means that determining that an unambiguous consent has been obtained would also not be possible. Furthermore, in such a case, it would be difficult to provide a way for the user to withdraw consent in a manner that is as easy as granting it.

The controller in charge of collecting consent from the data subjects will be the one that interacts with them first, because consent, in order to be valid, should be obtained prior to the processing. The EDPB also recalls that the controller gathering consent should name any other controllers to whom the data will be transferred and who wish to rely on the original consent.

Finally, where the profiling undertaken is likely to have a “similarly significant [effect]” on a data subject (for example, the display of online betting advertisements), Article 22 GDPR shall be applicable. An assessment in this regard will need to be conducted by the controller or joint controllers in each instance with reference to the specific facts of the targeting.

The EDPB welcomes comments to the Guidelines until 19th October.

You can learn more about joint controllership in our recent blog Joint controllership: key considerations by the EDPB.

 

Are you targeting social media users? You may need to adapt your processes to comply with the GDPR and the EDPB Guidelines. We can help you. Aphaia provides both GDPR, Data Protection Act 2018 and ePrivacy adaptation consultancy services, including data protection impact assessments, CCPA compliance and Data Protection Officer outsourcing.

Joint controllership: key considerations by the EDPB

The EDPB provides key considerations to clarify the concepts of processor, controller and joint controller in their Guidelines 07/2020.

The European Data Protection Board (EDPB) published their Guidelines 07/2020 on the concepts of controller and processor in the GDPR on 7th September. The guidelines aim to offer a precise meaning of these concepts and criteria for their correct interpretation that are consistent throughout the European Economic Area.

Since the CJEU, in its judgment in Fashion ID (C-40/17), considered the fashion retailer Fashion ID to be a controller jointly with Facebook for embedding the 'Like' button on its website, the concept of joint controllership has taken on a broader meaning, as it may now apply to some data processing activities that were deemed otherwise in the past.

In our blog today we go through the main insights provided by the EDPB with regard to the concept of joint controller.

The concept of joint controller in the GDPR

Pursuant to Article 26 of the GDPR, the qualification as joint controllers may arise where two or more controllers jointly determine the purposes and means of processing. The GDPR also states that the actors involved shall determine their respective responsibilities for compliance by means of an arrangement between them, the essence of which shall be made available to the data subjects. However, the GDPR does not contain further provisions specifying the details of this type of processing, such as the definition of 'jointly' or the legal form of the arrangement.

Joint participation

The EDPB explains that joint participation can take the form of a common decision taken by the two or more actors involved in the processing or result from converging decisions by them. Thus in practice, joint participation can take several different forms and it does not require the same degree of involvement or equal responsibility by the controllers in each case.

  • Joint participation through common decision. It means deciding together and involves a common intention.
  • Joint participation through converging decisions. This one results from the case law of the CJEU on the concept of joint controllers. According to the EDPB, decisions converge on purposes and means when they meet the following requirements:
    • They complement each other.
    • They are necessary for the processing to take place in such manner that they have a tangible impact on the determination of the purposes and means of the processing.

As a result, the question that should be contemplated to identify converging decisions would be along the lines of “Would the processing be possible without both parties’ participation in the sense that the processing by each party is inseparable?”.

The EDPB also highlights that the fact that one of the parties does not have access to personal data processed is not sufficient to exclude joint controllership.

 

Jointly determined purpose(s)

The EDPB considers that there are two scenarios under which the purpose pursued by two or more controllers may be deemed as jointly determined:

  • The entities involved in the same processing operation process such data for jointly defined purposes.
  • The entities involved pursue purposes which are closely linked or complementary. Such may be the case, for example, when there is a mutual benefit arising from the same processing operation, provided that each of the entities involved participates in the determination of the purposes and means of the relevant processing operation.

Jointly determined means

Joint controllership requires that two or more entities have exerted influence over the means of the processing. However, this does not mean that each entity involved needs in all cases to determine all of the means. Different circumstances may qualify as joint controllership where the rest of the requirements are met, even where the determination of the means is not equally shared between the parties, for example:

 

  • Different joint controllers define the means of the processing to a different extent, depending on who is effectively in a position to do so.
  • One of the entities involved provides the means of the processing and makes it available for personal data processing activities by other entities. The entity who decides to make use of those means so that personal data can be processed for a particular purpose also participates in the determination of the means of the processing. For example, the choice made by an entity to use for its own purposes a tool or other system developed by another entity, allowing the processing of personal data, will likely amount to a joint decision on the means of that processing by those entities.

 

Limits of joint controllership

The fact that several actors are involved in the same processing does not mean that they are necessarily acting as joint controllers of that processing. Not all kinds of partnership, cooperation or collaboration imply qualification as joint controllers, as such qualification requires a case-by-case analysis of each processing at stake and of the precise role of each entity with respect to each processing. The EDPB provides a non-exhaustive list of examples of situations where there is no joint controllership:

  • Preceding or subsequent operations: while two actors may be deemed joint controllers with regard to a specific data processing where the purpose and means of its operations are jointly determined, this does not affect the purposes and means of operations that precede or are subsequent in the chain of processing. In that case, the entity that decides alone should be considered as the sole controller of said preceding or subsequent operation.
  • Own purpose: the situation of joint controllers acting on the basis of converging decisions should be distinguished from the case of a processor, since the latter, while participating in the performance of a processing, does not process the data for its own purposes but carries out the processing on behalf of the controller.
  • Commercial benefit: the mere existence of a mutual benefit arising from a processing activity does not give rise to joint controllership. For example, if one of the entities involved is merely being paid for services rendered, it is acting as a processor rather than as a joint controller.

For instance, the use of a common data processing system or infrastructure will not in all cases lead to the parties involved qualifying as joint controllers, in particular where the processing they carry out is separable and could be performed by one party without intervention from the other, or where the provider is a processor in the absence of any purpose of its own. Another example would be the transmission of employee data to the tax authorities.

Joint controller arrangement

Joint controllers should put in place a joint controller arrangement in which they determine and agree on, in a transparent manner, their respective responsibilities for compliance with the GDPR. The following non-exhaustive list of tasks should be specified in that arrangement:

  • Response to data subjects' requests exercised pursuant to the rights granted by the GDPR.
  • Transparency duties to provide the data subjects with the relevant information referred to in Articles 13 and 14 GDPR.
  • Implementation of general data protection principles.
  • Legal basis of the processing.
  • Security measures.
  • Notification of a personal data breach to the supervisory authority and to the data subject.
  • Data Protection Impact Assessments.
  • The use of a processor.
  • Transfers of data to third countries.
  • Organisation of contact with data subjects and supervisory authorities.

The EDPB recommends documenting the relevant factors and the internal analysis carried out in order to allocate the different obligations. This analysis is part of the documentation under the accountability principle.

When it comes to the form of the arrangement, even though there is no legal requirement in the GDPR for a contract or other legal act, the EDPB recommends that the arrangement be made in the form of a binding document, such as a contract or other legally binding act under EU or Member State law to which the controllers are subject.

The EDPB welcomes comments to the Guidelines until 19th October.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides adaptation consultancy services for both, including data protection impact assessments, CCPA compliance and Data Protection Officer outsourcing.