Bank fined by Hungarian SA for unlawful data processing
Budapest Bank was fined by the Hungarian SA for unlawful data processing, as the controller’s use of AI systems lacked transparency.
Budapest Bank was recently fined by the Hungarian SA because the bank, as data controller, performed automated analyses of customer satisfaction using AI on customer service phone calls. This processing was not clearly disclosed to data subjects, which prompted an investigation last year into the controller’s general data processing practices, with specific focus on the automated analysis. The findings of the investigation resulted in a fine of approximately €650,000.
Customers’ level of satisfaction was assessed from recorded calls using AI technology, without data subjects having been informed of this processing.
The data controller recorded all customer service calls, which were analysed on a daily basis. Using AI technology, certain keywords were identified in order to determine the emotional state of the customer in each recording. The result of this analysis was then stored, linked to the phone call, and retained in the software’s system for 45 days. The purpose of this AI assessment was to compile a list of customers ranked by their likelihood of dissatisfaction, based on the audio recording of the customer service call. Designated employees were then expected to call these clients in an effort to establish the reasons for their dissatisfaction. Data subjects received no communication regarding this processing, making it impossible for them to exercise their right to object.
Assessments showed that this processing posed a high risk to data subjects.
While an impact assessment and a legitimate interest assessment were performed, and both confirmed that the data processing posed a high risk to data subjects’ rights, no action was taken to mitigate those risks. The data controller’s impact assessment acknowledged that the processing used AI and posed a high risk to the fundamental rights of data subjects. Neither assessment provided any actual risk mitigation, and the measures that existed on paper were insufficient or were not implemented in practice. Artificial intelligence is difficult to deploy in a transparent and safe manner, and therefore additional safeguards are necessary: the results of personal data processing by AI are typically difficult to verify, which can lead to biased outcomes.
The Hungarian SA ordered the controller to come into compliance and pay an administrative fine.
The Hungarian SA determined this to be a serious infringement of several articles of the GDPR, and also took into account the length of time over which the infringements persisted. The supervisory authority ordered the data controller to stop analysing the emotional state of clients, and to resume the processing only if it can be made compliant with the GDPR. In addition to the order to come into compliance, the controller was issued an administrative fine of approximately €650,000.
Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. We provide both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing, and can help your company get on track towards full compliance. Contact us today.