Non-transparent data checks by utility company result in a fine

Non-transparent data checks by an electric supply company have resulted in a fine from the Hamburg DPA.

 

A recent fine from the Hamburg DPA is the direct result of an electric supply company performing non-transparent data checks. The company was offering discounted sign-up costs to first-time customers and, as part of that process, performed data checks to verify whether customers signing up were indeed new, first-time customers or whether they had previously held accounts. These checks were not transparent, as the company failed to inform customers that they were part of the process. According to this release from the EDPB, a data check, or data comparison, is not in and of itself illegal. However, the fact that customers were not informed that these checks would be performed resulted in a GDPR violation, as the company breached its transparency obligation under the GDPR.

 

The electric supply company was found to have violated Articles 12 and 13 of the GDPR.

 

The electric supply company, Vattenfall Europe Sales GmbH, was found to have violated Articles 12 and 13 of the GDPR after an assessment of its process by the Hamburg Commissioner for Data Protection and Freedom of Information. Around 500,000 people were affected in total. Article 13 sets out the information which must be provided to a data subject at the time when data is collected, while Article 12 governs how it must be provided: “The controller shall take appropriate measures to provide any information referred to in Articles 13… relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language…”

 

The established violation and fine are not related to the processing itself, but the lack of transparency in communication with customers.

 

The fine, the corresponding violation and the eventual decision made this August by the Hamburg DPA are not related to the data comparisons themselves, as these are not, in and of themselves, explicitly regulated by the GDPR. The company performed data checks comparing the data received from customer sign-ups to customer data from previous years, which had been stored in accordance with tax and commercial law. The checks were intended to prevent customers from signing up for these bonus contracts repeatedly, which would have made the offer, meant to attract new customers, unprofitable for the company. The established illegality is limited to the insufficiently fulfilled transparency obligations towards customers.

 

The company has accepted the fine of EUR 901,388.84 and ceased the non-transparent data comparison immediately after the DPA’s first action.

 

Vattenfall Europe Sales GmbH did not contest the fine, which amounted to EUR 901,388.84, and immediately stopped performing the non-transparent data checks once the Hamburg DPA issued its initial decision. The company has cooperated fully with the Hamburg Commissioner and has agreed with the DPA on a transparent and comprehensive way of informing first-time and existing customers about the data comparison and its purpose. This will allow consumers to make an informed decision as to whether they want to apply for a discounted bonus contract, which includes an internal verification of their status as a new customer, or a non-discounted contract, which does not involve this data comparison.

 

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

National Police Board of Finland reprimanded over data breach during facial recognition trial

The National Police Board of Finland has recently been reprimanded for unlawful processing during a facial recognition trial by the National Bureau of Investigation unit.

 

Finnish police have been reprimanded for the unlawful processing of special categories of personal data during a facial recognition technology trial. In early 2020, the National Bureau of Investigation unit which specializes in the prevention of child sexual abuse experimented with facial recognition technology, using the US-based Clearview AI service, to aid in identifying potential victims. According to this report, the National Police Board was not aware of the trial, as the decision to try the software had been made independently by the National Bureau of Investigation unit.

 

The data controller, the National Police Board, was held accountable for the data breach and for its failure to supervise this operation, as the unit performing the processing had not been sufficiently informed of the protocol for handling special category data.

 

The National Police Board, in its capacity as the controller of the data processed by the police in Finland, informed the Office of the Data Protection Ombudsman of the personal data breach in April 2021. The National Bureau of Investigation unit had made the decision to use Clearview AI independently of the guidance of the National Police Board, and as a result the controller was unable to approve and supervise the facial recognition technology trial. After experimenting with the technology in early 2020, the investigative unit concluded that it was not suitable for use by Finnish authorities. The National Police Board only became aware of the situation through BuzzFeed News, a US-based online media company, and notified the Ombudsman of the breach involving the use of facial recognition technology during this trial period.

 

The National Police Board did not advise the National Bureau of Investigation unit on the manner in which special categories of data should be handled, or on how to go about the processing lawfully, and as a result was held responsible for the personal data breach. The Act on the Processing of Personal Data in Criminal Matters and in Connection with Maintaining National Security places the responsibility for the lawful processing of personal data on the controller. In addition, under the GDPR, processors do not have the same compliance obligations as controllers: according to Article 24 of the GDPR, it is the data controller that must actively demonstrate full compliance with all data protection principles. The Data Protection Ombudsman has therefore held the National Police Board accountable for this incident.

The National Police Board was reprimanded and ordered to take corrective action.

 

The National Police Board of Finland, as the controller, was held accountable for the breach and reprimanded. The Data Protection Ombudsman has also ordered the National Police Board to notify all data subjects whose identities can be confirmed of the personal data breach. In addition, the Board must request that Clearview AI delete all the data it collected from its storage database.

 

Do you use AI in your organisation and need help ensuring compliance with AI regulations? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Series of injunctions issued by CNIL

A series of injunctions have been issued by CNIL of France, for the mismanagement of a database containing fingerprints.

 

The CNIL of France has recently issued a series of injunctions to a government ministry, the Ministry of the Interior, for the alleged illegal storage of data, poor file management, and a lack of information given to persons whose data is stored on its system. The Automated Fingerprints File, initiated in 1987 to hold the fingerprints and handprints of people implicated in investigations, had accumulated a sizable database of the prints of over 6.2 million people, many of whose files should have been deleted for various reasons.

 

CNIL has accused the Ministry of the Interior of storing data unlawfully, as well as keeping data stored well beyond its lawful retention period.

 

According to a Euractiv report, CNIL criticized the Ministry of the Interior last month for storing data that was not provided for under the legislation. Depending on the gravity and nature of an offense, this data may be stored for 10, 15 or 25 years; in the event of an acquittal or dismissal of a case, however, all fingerprints and data must be deleted. In 2019, at the time of the CNIL investigation into this government ministry, over 2 million records were being kept past their retention periods. In addition, several million manual files were being kept without a legal basis, despite digitization efforts over several years. CNIL has asked that about 7 million manual files be deleted, even though they had not surpassed their retention period.

 

The injunctions issued by CNIL also concerned matters of security and information dissemination.

 

One of the issues raised by the CNIL was that police were able to access the files containing the aforementioned biometric information, as well as other personal information, with a password of only eight characters. The privacy authority therefore deemed this data insufficiently secured. In addition, under French law, individuals whose information is being processed must be informed of the purposes of the processing and of the party or parties responsible for it. This information must be provided either at the time of collection or at the time of the decision.

 

CNIL has given the Ministry of the Interior a timeframe to take corrective action for the series of injunctions issued.

 

As of July 2021, the State had notified CNIL that more than three million cards had been deleted in compliance with the retention-period rules. With regard to the manual files, however, CNIL rejected the suggested four-year period for their destruction, stating that the age of the cards concerned, the duration of the breach and the nature of the data did not allow for it. CNIL asked that the physical files be disposed of by 31st December 2022. For all other matters of compliance, the CNIL has given a deadline of 31st December 2021. According to the law, a fine cannot be imposed on the State.

 

 

 


Data Governance Act agreed upon by Council of the EU

The Data Governance Act is expected to give the EU a competitive advantage in a world that is becoming increasingly data-driven.

 

A mandate for a Data Governance Act has recently been agreed upon by the Council of the EU, and this is expected to make data sharing easier, leading to several other benefits by extension. In an October 1st press release, the Council announced that Member States had agreed upon a negotiating mandate on a proposal for a Data Governance Act (DGA). The act is intended to promote certain data sharing mechanisms, such as facilitating the reuse of certain categories of protected public-sector data, improving public confidence in data intermediation services and promoting data altruism.

 

The Data Governance Act will promote the reuse of public sector data while preserving privacy and confidentiality.

 

The Open Data Directive does not cover certain categories of data held by public sector bodies; the proposed DGA will soon complement it, allowing safer sharing of this category of data. The DGA will cover categories of public-sector data that are subject to the rights of others, including data protected by intellectual property rights, as well as trade secrets and personal data. Allowing this kind of reuse will require the technical capability to maintain privacy and confidentiality. The Council’s stance on this will promote greater flexibility, respecting any pre-existing national specificities of the various EU Member States.

 

The DGA is expected to foster the creation of a new business model – data intermediation.

 

Data intermediation services can help facilitate sharing by providing secure environments for companies and individuals to share data. This may take the form of digital platforms and would help individuals exercise their rights under the GDPR, while facilitating voluntary data sharing by companies. This may include features such as personal data spaces or data wallets, which would allow people to have full control over their data while sharing with companies that they trust.

 

Service providers will need to be kept in a register which individuals can consult to ensure that they are sharing with providers they can safely depend on. In addition, these providers will only be able to use the data for the intended purposes; it cannot be sold or used for any alternative purpose. The Council of the EU has also clarified which types of companies can function as data intermediation service providers.

 

Data altruism is expected to be made more feasible by the Data Governance Act, allowing companies and individuals to share data for the common good.

 

Individuals and companies may want, or need, to share data for research and other purposes in the public interest. The proposal for data governance is expected to make it easier to make data voluntarily available for these purposes. Organizations will be able to request to be registered to collect data for objectives of general interest, and registered organizations will be recognized across all EU Member States. The trust created by this registration is expected to encourage individuals and companies to voluntarily share data with these organizations, and this data can then be used to benefit wider society. There will also be a compliance code of conduct to which these organizations must adhere, created with the cooperation of data altruism organisations and relevant stakeholders.

 

 

A European Data Innovation Board will be created to ensure consistency in practice for all organizations involved.

 

The introduction of the DGA will usher in a new structure, the European Data Innovation Board, which will be tasked with maintaining consistency for organizations involved in the data sharing process. The Board will be expected to advise and assist the Commission in enhancing the interoperability of data intermediation services, and will ensure consistency in the processing of requests for public-sector data. These changes are all expected to foster increased sharing by reassuring the public that data sharing can indeed be safe while maintaining the protection of their rights and freedoms.

 

 
