The concept of “data exporter” clarified by the Danish DPA

In the light of the CJEU's Schrems II judgment, the Danish DPA has answered questions and clarified the concept of “data exporter”.

 

Since the CJEU’s Schrems II judgment, the Danish Data Protection Agency has received an increasing number of questions relating to the transfer of personal data to third countries. Many of these questions concern the concept of “data exporter” and who, in practice, is responsible for ensuring that a transfer of personal data complies with data protection rules, especially in larger, more complex data processing arrangements. While the term “data exporter” is not defined in the GDPR, it is defined in the EU Commission’s standard contract, which is one of the most widely used transfer bases under Chapter V of the GDPR. As a result, the Danish DPA has decided to provide clarification on the role and concept of a “data exporter.”

 

A data controller or processor in a third country to whom data is transferred under a standard contract is considered a “data importer.”

 

A standard contract can be entered into by an EU data controller who transfers personal data to a data controller or data processor in a third country; the third-country controller or processor is then the “data importer”. This arrangement has raised doubts as to which party is responsible for ensuring the legality of the transfer under the GDPR, particularly where one or more sub-processors are located outside the EU/EEA.

 

The GDPR stipulates that both parties (whether acting as controller or processor) are responsible for establishing a legal basis for the transfer.

 

According to GDPR Article 44, “Any transfer of personal data which are undergoing processing or are intended for processing after transfer to a third country or to an international organisation shall take place only if, subject to the other provisions of this Regulation, the conditions laid down in this Chapter are complied with by the controller and processor, including for onward transfers of personal data from the third country or an international organisation to another third country or to another international organisation.” The Danish Data Protection Agency interprets this article as imposing an obligation on both the data controller and the data processor. Both parties are therefore obliged to ensure that the transfer is covered by a transfer basis that is effective in the light of all the circumstances of the transfer.

Under the GDPR, both the controller and the processor are expected to take the measures necessary to ensure an appropriate level of security for the data.

 

Article 32 of the GDPR requires the controller and the processor to establish an appropriate level of processing security. The Danish Data Protection Agency regards the data controller and any data processors as independently subject to this obligation. This means that the data controller and the data processor are each expected to take the technical and organizational measures necessary to establish an appropriate level of processing security. In cases where the data processor provides most or all of the technical infrastructure, it falls to the data controller to ensure – and to be able to demonstrate to the Danish DPA – that the data processor has established a satisfactory level of security for the data being processed.

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

New data strategy introduced in the UK

New data strategy introduced in the UK to drive innovation and improve efficiency in the health sector. 

 

The UK recently announced a new data strategy for health data, which focuses on 7 principles to harness the data-driven power and innovation exhibited during the pandemic and use it to improve the future of healthcare. These principles will be implemented to drive transformation in health care and create a secure system, for both patients and professionals, that prioritizes privacy. The principles set out in the data strategy include improving trust in the health care system’s use of data, giving health care professionals the information they need to provide the best care, improving data for adult social care, supporting local decision-makers with data, empowering researchers with the data they need to develop life-changing treatments and diagnostics, working with partners to develop innovations that improve health care, and developing the right technical infrastructure.

 

Secure data environments will be made the default for the health sector, and de-identified data will be used to perform research. 

 

In order to give patients the confidence that their personal information is safe, the NHS will make secure data environments the default, and adult social care organisations will provide access to de-identified data for research. As a result, data linked to a single individual will never leave a secure server, and de-identified data will only be used for research purposes. This is expected to enable the delivery of cutting-edge life-saving treatments and quicker diagnosis through clinical trials, as well as more diverse and inclusive research to tackle health inequalities. The public will be consulted on a new ‘data pact’, which will set out how the healthcare system will use patient data and what the public has the right to expect from it.

 

The new data strategy aims to digitize and improve several processes, benefiting both patients and healthcare providers.

 

The new data strategy introduced in the UK will also include some key commitments to patients, giving them greater access to and control over their data. These include a simplified opt-out process for data sharing and improved access to records via the NHS App. The strategy sets a target for 75% of the adult population to be registered to use the NHS App by March 2024, making it a one-stop shop for health needs, and aims for at least 80% of social care providers to have a digitised care record in place by March 2024.

 

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Dark patterns in social media platform interfaces

Dark patterns are defined by the European Data Protection Board (EDPB) as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”. These dark patterns seek to influence user behaviour on those platforms, hindering users’ ability to make conscious choices that effectively protect their personal data. Data protection authorities are responsible for sanctioning the use of dark patterns where they breach GDPR requirements. The categories of dark patterns include overloading, skipping, stirring, hindering, fickle designs and leaving users in the dark.

Overloading

Overloading refers to situations in which users are confronted with an unreasonable number of requests, or with too much information, too many options or too many possibilities, prompting them to share more data than necessary or to unintentionally allow personal data processing contrary to their expectations as data subjects. Overloading techniques include continuous prompting, privacy mazes and providing too many options.

Skipping

Skipping means designing the interface or user experience in such a way that users forget, or do not think about, some or all of the data protection aspects. Examples of dark patterns that result in skipping include deceptive snugness and “look over there”.

Stirring

Stirring is a dark pattern that affects the choices users make by appealing to their emotions or using visual nudges. It includes emotional steering and pertinent information being “hidden in plain sight”.

Hindering 

Hindering refers to obstructing or blocking users from becoming informed or from managing their data, by making the process extremely hard or impossible to achieve. Dark patterns considered hindering include dead-end designs, longer-than-necessary processes and misleading information.

Fickle Interfaces

Fickle interfaces are designed in an inconsistent and unclear manner, making it hard for users to navigate the various data protection control tools and to understand the purpose of the processing. These interfaces include those lacking hierarchy as well as those that use decontextualising within the design.

Interfaces that leave users in the dark  

An interface is considered to leave users in the dark if it is designed in a way that hides information or data protection control tools, leaving users unsure of how their data is processed and of what control they have over it with regard to the exercise of their rights. Examples of this include language discontinuity, conflicting information and ambiguous wording or information.

We recently published a short vlog on YouTube outlining the types of dark patterns, explaining how adherence to the GDPR principles can prevent your user interface design from falling into these patterns, and noting which measures deserve special attention to avoid sanctions.

Subscribe to Aphaia’s YouTube channel for more information on AI ethics and data protection. 

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Bank fined by Hungarian SA for unlawful data processing

Budapest Bank fined by the Hungarian SA for unlawful data processing, as the controller’s use of AI systems lacked transparency.

 

Budapest Bank was recently fined by the Hungarian SA because the bank, as data controller, performed automated AI-based analyses of customer satisfaction on customer service phone calls. This processing was not clearly disclosed to data subjects, and it prompted an investigation last year into the controller’s general data processing practices, specifically with regard to the automated analysis. The findings of the investigation resulted in a fine of approximately €650,000.

 

Customers’ level of satisfaction was assessed from recorded calls using AI technology, without having informed data subjects of this processing. 

 

The data controller recorded all customer service calls, which were then analysed on a daily basis. Using AI technology, certain keywords were identified in each recording to determine the emotional state of the customer. The result of this analysis was stored, linked to the phone call, and kept in the software’s system for 45 days. The purpose of the AI assessment was to compile a list of customers ranked by their likelihood of dissatisfaction, based on the audio recording of the customer service call. Designated employees were then expected to call these clients to establish the reasons for their dissatisfaction. Data subjects received no information about this processing, making it impossible for them to exercise their right to object.
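
To make the mechanics concrete, the following is a minimal, purely illustrative Python sketch of how a keyword-based dissatisfaction-scoring pipeline of this general shape might work. The keyword list, weights, and all class and function names are hypothetical assumptions: the decision only discloses that the system used keywords to infer customers’ emotional state, retained results for 45 days, and ranked customers for follow-up calls.

```python
import re
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical keyword weights; the decision does not disclose the real vocabulary.
DISSATISFACTION_KEYWORDS = {"complaint": 3, "unacceptable": 3, "cancel": 2, "slow": 1}

RETENTION = timedelta(days=45)  # analysis results were reportedly kept for 45 days


@dataclass
class CallAnalysis:
    customer_id: str
    call_id: str
    score: int  # higher score = more likely dissatisfied
    analysed_at: datetime


def score_transcript(transcript: str) -> int:
    """Sum the weights of any dissatisfaction keywords found in a call transcript."""
    words = set(re.findall(r"[a-z']+", transcript.lower()))
    return sum(weight for kw, weight in DISSATISFACTION_KEYWORDS.items() if kw in words)


def rank_customers(analyses: list[CallAnalysis]) -> list[CallAnalysis]:
    """Drop results past the retention period, then rank customers for follow-up calls."""
    now = datetime.now()
    fresh = [a for a in analyses if now - a.analysed_at < RETENTION]
    return sorted(fresh, key=lambda a: a.score, reverse=True)


# Example: two analysed calls, ranked by likelihood of dissatisfaction.
calls = [
    CallAnalysis("cust-1", "call-17",
                 score_transcript("I want to cancel, this is unacceptable"),
                 datetime.now()),
    CallAnalysis("cust-2", "call-18",
                 score_transcript("Thanks, that resolved my issue"),
                 datetime.now()),
]
for analysis in rank_customers(calls):
    print(analysis.customer_id, analysis.score)  # cust-1 first, with score 5
```

Again, this sketch is not the bank’s actual system. Notably, the GDPR problem was not the scoring logic as such, but the fact that data subjects were never told any such pipeline existed, and so could never object to it.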

 

Assessments showed that this processing posed a high risk to data subjects. 

 

An impact assessment and a legitimate interest assessment were both performed, and both confirmed that the data processing posed a high risk to data subjects’ rights, yet no action was taken to mitigate those risks. The data controller’s impact assessment confirmed that the processing used AI and posed a high risk to the fundamental rights of data subjects. Neither assessment provided any actual risk mitigation, and the measures that existed on paper were insufficient and not implemented in practice. Artificial intelligence is difficult to deploy in a transparent and safe manner, so additional safeguards are necessary. It is typically difficult to verify the results of AI-based processing of personal data, which can lead to biased outcomes.

 

The Hungarian SA ordered the controller to come into compliance and pay an administrative fine. 

 

The Hungarian SA determined that this constituted a serious infringement of several articles of the GDPR, and also took into account the length of time over which the infringements persisted. The supervisory authority ordered the data controller to stop analysing the emotional state of clients and to resume the processing only if it can be brought into compliance with the GDPR. In addition to the compliance order, the controller was issued an administrative fine of approximately €650,000.

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.