New data strategy introduced in the UK

A new data strategy has been introduced in the UK to drive innovation and improve efficiency in the health sector. 

 

The UK recently announced a new data strategy for health data, focused on seven principles intended to harness the data-driven power and innovation exhibited during the pandemic and use it to improve the future of healthcare. These principles will be implemented to drive transformation in healthcare and to create a secure system for both patients and professionals, one which prioritises privacy. The principles set out in the data strategy include improving trust in the healthcare system’s use of data, giving healthcare professionals the information they need to provide the best care, improving data for adult social care, supporting local decision-makers with data, empowering researchers with the data they need to develop life-changing treatments and diagnostics, working with partners to develop innovations that improve healthcare, and developing the right technical infrastructure.

 

Secure data environments will be made the default for the health sector, and de-identified data will be used to perform research. 

 

In order to give patients confidence that their personal information is safe, the NHS will make secure data environments the default, and adult social care organisations will provide access to de-identified data for research. As a result, data linked to a single individual will never leave a secure server, and de-identified data will only be used for research purposes. This is expected to enable the delivery of cutting-edge, life-saving treatments and quicker diagnosis through clinical trials, as well as more diverse and inclusive research to tackle health inequalities. The public will be consulted on a new ‘data pact’, which will set out how the healthcare system will use patient data and what the public has the right to expect from it. 
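By way of a simplified, hypothetical illustration of what de-identification can look like in practice (this is a generic pseudonymisation sketch, not the NHS’s actual scheme; all names and values below are invented), direct identifiers can be replaced with keyed hashes before records leave the secure environment, so that researchers can link records belonging to the same person without learning who that person is:

```python
import hashlib
import hmac

# Generic pseudonymisation sketch (NOT the NHS's actual scheme): replace the
# direct identifier with a keyed hash so that records for the same patient
# remain linkable for research, while the identity stays inside the secure
# environment that holds the secret key.
SECRET_KEY = b"kept-inside-the-secure-environment"  # illustrative value only

def pseudonymise(record: dict) -> dict:
    """Return a copy of the record with the identifier replaced by a keyed hash."""
    token = hmac.new(SECRET_KEY, record["patient_id"].encode(), hashlib.sha256).hexdigest()
    deidentified = {k: v for k, v in record.items() if k != "patient_id"}
    deidentified["patient_token"] = token
    return deidentified

r1 = pseudonymise({"patient_id": "943-476-5919", "diagnosis": "asthma"})
r2 = pseudonymise({"patient_id": "943-476-5919", "diagnosis": "hypertension"})
# Same patient -> same token, so the two records stay linkable for research
# without the identifier ever leaving the secure environment:
print(r1["patient_token"] == r2["patient_token"])  # True
```

Without the secret key, the token cannot be reversed to recover the identifier, which is the property that lets individual-level data stay on a secure server while still supporting research.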

 

The new data strategy aims to digitise and improve several processes, making things easier for both patients and healthcare providers. 

 

The new data strategy introduced in the UK also includes some key commitments to patients, giving them greater access to and control over their data. This will incorporate a simplified opt-out process for data sharing and improved access to records via the NHS App. The strategy commits to a target of having 75% of the adult population registered to use the NHS App by March 2024, making it a one-stop shop for health needs. It also aims to have at least 80% of social care providers with a digitised care record in place by March 2024. 

 

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Bank fined by Hungarian SA for unlawful data processing

Budapest Bank has been fined by the Hungarian SA for unlawful data processing, as the controller’s use of AI systems lacked transparency. 

 

Budapest Bank was recently fined by the Hungarian SA because the bank, as data controller, performed automated AI analyses of customer satisfaction on customer service phone calls. This processing was not clearly disclosed to data subjects, and it prompted an investigation into the controller’s actions last year, reviewing its general data processing practice and specifically the automated analysis. The findings of that investigation resulted in a fine of approximately €650,000. 

 

Customers’ level of satisfaction was assessed from recorded calls using AI technology, without having informed data subjects of this processing. 

 

The data controller recorded all customer service calls and analysed them on a daily basis. Using AI technology, the software identified certain keywords in order to determine the emotional state of the customer in each recording. The result of this analysis was then stored, linked to the phone call, and retained in the software’s system for 45 days. The point of this AI assessment was to compile a list of customers sorted by their likelihood of dissatisfaction, based on the audio recordings of their customer service calls. Based on this data, designated employees were then expected to call clients in an effort to assess their reasons for dissatisfaction. Data subjects received no communication regarding this processing, making it impossible for them to exercise their right to object. 
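As a purely illustrative sketch of the kind of pipeline described above (none of the function names or keywords come from the decision; keyword spotting on transcripts is only one crude way such a system could work), the flow is: detect negative keywords, score each call, then rank customers for call-backs:

```python
# Hypothetical illustration of the pipeline described in the decision:
# keyword spotting on call transcripts -> dissatisfaction score -> ranked
# call-back list. All names and keywords here are invented for illustration.
NEGATIVE_KEYWORDS = {"complaint", "cancel", "angry", "refund"}

def dissatisfaction_score(transcript: str) -> int:
    """Count negative keywords as a crude proxy for the customer's emotional state."""
    words = transcript.lower().split()
    return sum(1 for w in words if w in NEGATIVE_KEYWORDS)

def rank_customers(calls: dict) -> list:
    """Return customer IDs sorted by likelihood of dissatisfaction (highest first)."""
    return sorted(calls, key=lambda c: dissatisfaction_score(calls[c]), reverse=True)

calls = {
    "cust_a": "I want to cancel this is a complaint and I am angry",
    "cust_b": "thanks everything was fine",
}
print(rank_customers(calls))  # cust_a ranks first
```

In the actual case, the stored scores remained linked to each recording for 45 days; the GDPR problem was not the ranking mechanism itself but that data subjects were never informed this analysis was happening at all.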

 

Assessments showed that this processing posed a high risk to data subjects. 

 

An impact assessment and a legitimate interest assessment were both performed, and both confirmed that the data processing, which uses AI, posed a high risk to the fundamental rights of data subjects; yet no action was taken to mitigate those risks. Neither assessment provided any actual risk mitigation, and the measures which did exist on paper were insufficient. Artificial intelligence is difficult to deploy in a transparent and safe manner, and therefore additional safeguards are necessary. It is typically difficult to verify the results of personal data processing by AI, which can lead to biased results.

 

The Hungarian SA ordered the controller to come into compliance and pay an administrative fine. 

 

The Hungarian SA determined this to be a serious infringement of several articles of the GDPR, and also considered the length of time over which these infringements persisted. The supervisory authority ordered the data controller to stop analysing the emotional state of clients, and to continue the data processing only if it can be made compliant with the GDPR. In addition to being ordered to come into compliance, the controller was issued an administrative fine of approximately €650,000.


Clearview fined by the ICO for unlawful data collection and processing

Clearview AI Inc was fined over £7.5 million, and ordered to delete photos and data of UK residents from its database. 

 

The ICO has fined Clearview AI Inc £7,552,800 for using images of people, including people in the UK, that were scraped from the web and from social media profiles to create a global online database geared towards facial recognition. The enforcement notice issued by the ICO orders the company to stop collecting and using the personal data of UK residents, and to delete the data of any UK residents from its systems.

 

Clearview provides customers with a service which allows them to find information on an individual through its database, using facial recognition software. 

 

Clearview AI Inc has accumulated well over 20 billion images of faces, along with associated data on individuals all over the world, from data that is publicly available on the internet and on social media platforms, and has used this to create an online database. This database is intended to refine facial recognition software and practices. Internet users were never informed of the collection and use of their images. The service provided by the company allows its customers, including the police, to upload an image of a person to the company’s app, which then compares the image against all the images in the database in order to find a match. This process typically results in a list of images that share characteristics with the photo provided by the customer, along with links to the websites from which those images were taken.
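As a generic sketch of how this kind of matching service works (this is standard nearest-neighbour search over face embeddings, not Clearview’s actual code or API; the vectors and URLs below are invented), a probe image is reduced to an embedding vector and compared against every database entry, returning the closest matches with their source links:

```python
import math

# Generic face-matching sketch: a probe image is converted to an embedding
# vector, then compared against a database of embeddings; the closest entries
# are returned along with the source URL each image was scraped from.
# Embeddings and URLs below are invented for illustration.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def find_matches(probe, database, top_k=2):
    """Return the top_k most similar database entries as (similarity, url) pairs."""
    scored = [(cosine_similarity(probe, emb), url) for emb, url in database]
    return sorted(scored, reverse=True)[:top_k]

database = [
    ([0.9, 0.1, 0.0], "https://example.com/profile1"),
    ([0.0, 1.0, 0.1], "https://example.com/profile2"),
    ([0.8, 0.2, 0.1], "https://example.com/profile3"),
]
matches = find_matches([1.0, 0.1, 0.0], database)
print(matches)  # profile1 and profile3 rank highest
```

The regulatory significance is that each returned link points back to the site an image was scraped from, so a single query can aggregate a person’s online presence without their knowledge.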

 

Clearview’s database likely includes a substantial amount of data from UK residents, which the UK Commissioner deems “unacceptable”.

 

Considering the volume of UK internet and social media users, it is quite likely that the company’s database includes a substantial amount of data from UK residents, which was collected without their knowledge. While Clearview has ceased offering its services to UK organisations, the company still has customers in other countries, and continues to use the personal data of UK residents, making their data available to those other international clients. In a statement from the ICO, John Edwards, UK Information Commissioner said “Clearview AI Inc has collected multiple images of people all over the world, including in the UK, from a variety of websites and social media platforms, creating a database with more than 20 billion images. The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.”

 

The ICO found that the company breached UK data protection laws, which led to Clearview being fined by the ICO. 

 

Through its investigation, the ICO found that Clearview AI used the information of people in the UK in a way that is neither fair nor transparent, considering that individuals were not made aware, nor would they reasonably expect, that their personal data was being used in such a way. The company also has no process in place to delete data after a period of time, meaning the data it has collected could be retained and used indefinitely. Clearview also failed to establish a lawful basis for the collection of all this data. The data collected by the company falls into the class of special category data, which is subject to higher data protection standards under the UK GDPR, and Clearview AI failed to meet those standards. To make matters worse, when approached by members of the public seeking to exercise their right to erasure, the company required that they send additional personal information in order to have the request fulfilled, which may have acted as a deterrent to those individuals. These infringements led to the ICO fining Clearview a total of over £7.5 million. The company was also ordered to delete any data concerning UK residents from its database. 


Uber fined by Italian DPA for lack of transparency

A major privacy violation has led the Italian DPA, the Garante, to fine two Uber companies €2 million and €120,000 respectively. 

 

A privacy violation affecting over 1.5 million individuals has resulted in Uber receiving two fines, of €2 million and €120,000 respectively, from the Italian DPA, the Garante. Uber BV has its European office in Amsterdam, and Uber Technologies Inc (UTI) has its registered office in San Francisco. Both of these entities were held responsible for the privacy violation affecting over 1.5 million Italian users, including drivers and passengers. During an investigation carried out at Uber Italy, following a privacy violation made public by the company’s US leadership in 2017, the Italian DPA found that Uber had committed several violations, including processing data without consent and failing to notify the Authority of a privacy violation. 

 

Uber had previously been fined by two other authorities in Europe for a similar violation. 

 

A privacy violation which occurred before the GDPR became fully applicable resulted in Uber being fined by both the Dutch and UK authorities on the basis of their respective national regulations. The personal information processed by Uber included personal and contact details (name, phone number and email), app access credentials, location data, and relationships with other users (shared trips, friend referrals and profiling information).

 

The Italian DPA fined both Uber BV and Uber Technologies Inc for multiple privacy violations. 

 

The Italian Authority has recently sanctioned the Dutch company Uber BV and the US company Uber Technologies Inc as joint controllers. Both companies were found responsible for violations of Europe’s privacy law affecting Italian users. The sanctions concern inadequate information given to users (the information provided failed to communicate the joint controllership of the data), which according to the Authority was “formulated in a generic and approximate way”, with “unclear and incomplete information” that was “not easy to understand”. 

 

According to the Italian DPA, the purposes of the processing were not properly specified in the information provided, the references to the rights of data subjects were vague and incomplete, and it was not clear whether users were obliged to provide their data, nor what the consequences of a refusal might be. In addition, without valid consent, the company processed the data of approximately 1,379 passengers and went on to profile them on the basis of their so-called “fraud risk”. Finally, the company also failed to notify the Authority of the processing of data for geolocation purposes, as was required by the legislation in force before the GDPR. 

 

The Authority decided on two fines; one for €2 million and another for €120,000. 

 

In deciding on the amount of the fines, the Authority considered the seriousness of the violations, the number of people affected, and the economic conditions of the companies. The Authority decided on two fines, of €2 million and €120,000, imposed on Uber BV and Uber Technologies Inc. 
