Facial recognition payment system launched in Moscow

The facial recognition payment system recently launched in Moscow is the first to be used on such a large scale.

Commuters in Moscow, Russia now have the option of using a new, voluntary payment method called “Face Pay”, which lets them pay for travel by scanning their face in front of a camera at a designated turnstile. To sign up, commuters first submit their bank card details along with their photo identification. Moscow’s transport department has stated that any information stored on commuters to enable Face Pay is securely encrypted. The system is entirely voluntary, as all other payment methods remain available.

Moscow is the first city to operate a facial recognition payment system on such a scale.


The city of Moscow is home to one of the world’s largest video surveillance systems. Facial recognition technology has been used there to enforce COVID-19 quarantines and to make preventive arrests and detentions, and the city was also reported to have used the technology during the 2018 football World Cup in an effort to find wanted criminals. This, however, is the first time the city is using facial recognition for a payment system, and the first time any city has deployed one at such a scale. According to Reuters, Maxim Liksutov, head of the Russian capital’s transport department, said in a recent statement: “Moscow is the first city in the world where this system is operating on such a scale.”

Moscow’s new facial recognition payment system has received backlash, particularly over privacy concerns.

The new facial recognition payment system in Moscow has already received some backlash. Digital rights groups claim that the system could undermine the privacy and human rights of the population, and Roskomsvoboda, a digital rights and freedom of information organisation, has warned that a facial recognition payment system could be used for surveillance purposes. The city’s transport department maintains that commuters’ data will be securely encrypted. Since the payment system collects biometric data, it is essential that this data remain well protected.

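Moscow’s transport department has not published technical details of its encryption, so the following is only a minimal sketch of what encrypting a stored biometric record at rest could look like, using the Fernet recipe from the Python cryptography library. The data, variable names and key handling here are illustrative assumptions, not the Metro’s actual implementation.

```python
from cryptography.fernet import Fernet

# Illustrative only: encrypt a biometric template before storing it.
# In a real deployment the key would live in a hardware security module
# or managed key store, never alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

facial_template = b"hypothetical binary feature vector from the enrolment photo"
token = cipher.encrypt(facial_template)  # authenticated encryption (AES-CBC + HMAC)

# Decryption fails loudly if the stored token has been tampered with.
assert cipher.decrypt(token) == facial_template
```

Encryption at rest, of course, addresses only one of the risks critics have raised: it protects the stored data, but says nothing about how that data may be used.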

Would this be allowed in Europe?

The GDPR does not prohibit the use of special category data such as fingerprints or facial recognition data; however, there are guidelines for its use which serve to protect not only user data, but also the rights and freedoms of individuals. As we reported on our blog in June, the EDPB and the EDPS called for a general ban on any use of AI for automated recognition of human features in publicly accessible spaces, in any context.

Cristina Contero Almagro, Partner in Aphaia, points out that “Considering the nature of the processing and the data involved, this type of activity would require both a Data Protection Impact Assessment and AI Ethics Assessment in order to identify and minimise the risks for the rights and freedoms of the data subjects. Together with the purpose limitation principle, ensuring that the transparency obligations are met would also be paramount for the processing to be lawful and several mitigation measures would need to be implemented, such as short storage times and the right to obtain human intervention”.


Do you need additional insight on facial recognition and the GDPR specific to your company’s operations? Aphaia can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Non-transparent data checks by utility company result in a fine

Non-transparent data checks by an electric supply company have resulted in a fine from the Hamburg DPA.

A recent fine from the Hamburg DPA is the direct result of an electric supply company performing non-transparent data checks. The company was offering discounted sign-up costs to first-time customers and, as part of that process, performed data checks to verify whether customers signing up were indeed new, first-time customers or whether they had previously held accounts. These checks were not transparent, as the company failed to inform customers that they were part of the process, and as a result the company was fined by the Hamburg DPA. According to a release from the EDPB, a data check, or data comparison, is not in and of itself illegal. However, the failure to inform customers that these checks would be performed breached the transparency obligation under the GDPR.

The electric supply company was found to have violated Articles 12 and 13 of the GDPR.


The electric supply company, Vattenfall Europe Sales GmbH, was found to have violated Articles 12 and 13 of the GDPR after an assessment of its process by the Hamburg Commissioner for Data Protection and Freedom of Information. Around 500,000 people were affected in total. Article 13 sets out the information which must be provided to a data subject at the time their data is collected, while Article 12 requires that “The controller shall take appropriate measures to provide any information referred to in Articles 13… relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language…”

The established violation and fine are not related to the processing itself, but to the lack of transparency in communication with customers.

The fine, the corresponding violation and the eventual decision made this August by the Hamburg DPA are not related to the data comparisons themselves, as these are not, in and of themselves, explicitly regulated by the GDPR. The company performed data checks comparing the data received from customer sign-ups against customer data from previous years, which had been stored in accordance with tax and commercial law. The checks were intended to prevent customers from signing up for and receiving these bonus contracts repeatedly, which would make the offer, meant to attract new customers, unprofitable for the company. The established illegality is limited to the insufficiently fulfilled transparency obligations towards customers.

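The EDPB release does not describe how Vattenfall’s comparison worked technically, but a new-customer check of this kind can be sketched roughly as follows. The field names, normalisation and hashing below are assumptions made purely for illustration.

```python
import hashlib

def customer_key(name: str, address: str, birth_date: str) -> str:
    """Derive a comparison key from identifying details (illustrative)."""
    normalised = "|".join(part.strip().lower() for part in (name, address, birth_date))
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Keys derived from past customer records retained under tax and commercial law.
archived_customer_keys: set[str] = set()

def qualifies_for_new_customer_bonus(name: str, address: str, birth_date: str) -> bool:
    """Return True if no archived record matches the applicant."""
    return customer_key(name, address, birth_date) not in archived_customer_keys
```

As the Hamburg DPA made clear, it was not a comparison like this that breached the GDPR, but running it without telling applicants that it would happen.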

The company has accepted the fine of EUR 901,388.84 and ceased the non-transparent data comparison immediately after the DPA’s first action.


Vattenfall Europe Sales GmbH did not contest the fine, which amounted to EUR 901,388.84, and in fact immediately stopped performing the non-transparent data checks once the Hamburg DPA issued its initial decision. The company has cooperated fully with the Hamburg Commissioner and has agreed with the DPA on a way of informing first-time and existing customers about the data comparison and its purpose in a transparent and comprehensive manner. This will allow consumers to make an informed choice between a discounted bonus contract, knowing that it involves an internal verification of their status as a new customer, and a non-discounted contract which does not involve this data comparison.


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

National Police Board of Finland reprimanded over data breach during facial recognition trial

The National Police Board of Finland has recently been reprimanded for unlawful processing carried out during a facial recognition trial by a National Bureau of Investigation unit.

Finnish police have been reprimanded for the unlawful processing of special categories of personal data during a facial recognition technology trial. In early 2020, the National Bureau of Investigation unit which specializes in the prevention of child sexual abuse experimented with facial recognition technology, using the US-based Clearview AI service, to aid in identifying potential victims. According to reports, the National Police Board was not aware of this trial, as the decision to try the software had been made independently by the unit.

The data controller, the National Police Board, was held accountable for the data breach and for its inability to supervise the operation, as the unit carrying out the processing had not been sufficiently informed of the protocol for handling special category data.

The National Police Board, in its capacity as controller of the data processed by the police in Finland, informed the Office of the Data Protection Ombudsman of the personal data breach in April 2021. The National Bureau of Investigation unit had made the decision to use Clearview AI independently, without the guidance of the National Police Board, and as a result the controller was unable to approve or supervise the facial recognition technology trial. After experimenting with the technology in early 2020, the investigative unit concluded that it was not suitable for Finnish authorities. The National Police Board only became aware of the situation through Buzzfeed News, a US-based online media company, and subsequently notified the Ombudsman.

The National Police Board did not advise the National Bureau of Investigation unit on the manner in which special categories of data should be handled, or on how to carry out the processing lawfully, and as a result was held responsible for the personal data breach. The Act on the Processing of Personal Data in Criminal Matters and in Connection with Maintaining National Security places the responsibility for the lawful processing of personal data on the controller. In addition, under the GDPR, processors do not have the same compliance obligations as controllers: under Article 24 of the GDPR, the controller must implement appropriate measures to ensure, and be able to demonstrate, that processing complies with the Regulation. The Data Protection Ombudsman has therefore held the National Police Board accountable for this incident.

The National Police Board was reprimanded and ordered to take corrective action.


The National Police Board of Finland, as the controller, was held accountable for the breach and reprimanded. The Data Protection Ombudsman has also ordered the National Police Board to notify all data subjects whose identities can be confirmed of the personal data breach. In addition, the Board must request that Clearview AI delete all of the collected data from its storage database.

Do you use AI in your organisation and need help ensuring compliance with AI regulations? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Series of injunctions issued by CNIL

A series of injunctions has been issued by France’s CNIL for the mismanagement of a database containing fingerprints.

The CNIL of France has recently issued a series of injunctions to a government ministry, the Ministry of the Interior, for the alleged illegal storage of data, poor file management, and a lack of information given to persons whose data is stored on its system. The Automated Fingerprints File, initiated in 1987, contains the fingerprints and handprints of people implicated in investigations and has accumulated a sizable database of the prints of over 6.2 million people. Many of these files should have been deleted for various reasons.

CNIL has accused the Ministry of the Interior of storing data unlawfully, as well as keeping data well beyond its lawful retention period.

According to a Euractiv report, CNIL criticized the Ministry of the Interior last month for storing data that was not provided for under the legislation. Depending on the gravity and nature of an offense, this data may be stored for 10, 15 or 25 years. In the event of an acquittal or dismissal of a case, however, all fingerprints and data must be deleted. In 2019, at the time of the CNIL investigation into this government ministry, over 2 million records were being kept past their retention periods. In addition, several million manual files were being kept without a legal basis, despite digitization efforts over several years. The CNIL has asked that about 7 million manual files be deleted even though they had not surpassed their retention periods.

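As a rough illustration of the retention logic described above, consider the following sketch. The gravity tiers and field names are simplified assumptions reflecting the 10-, 15- and 25-year periods mentioned in the report, not the actual categories of French law.

```python
from datetime import date

# Simplified mapping of offence gravity to retention period in years (illustrative).
RETENTION_YEARS = {"lesser": 10, "serious": 15, "grave": 25}

def must_delete(gravity: str, recorded_on: date, acquitted_or_dismissed: bool,
                today: date) -> bool:
    """Return True if a fingerprint record should no longer be retained."""
    if acquitted_or_dismissed:
        # Acquittal or dismissal of the case: deletion is required regardless of age.
        return True
    years_kept = (today - recorded_on).days / 365.25
    return years_kept > RETENTION_YEARS[gravity]

# A 1995 record for a lesser offence is long past its 10-year period.
print(must_delete("lesser", date(1995, 6, 1), False, date(2021, 10, 1)))  # True
```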

The injunctions issued by CNIL also concerned matters of security and information dissemination.


One of the issues raised by the CNIL was that police were able to access the files containing the aforementioned biometric information, as well as other personal information, with a password of only 8 characters. The privacy authority therefore deemed this data insufficiently secured. In addition, under French law, individuals whose information is being processed must be informed of the purposes of that processing, as well as the party or parties responsible for it. This information must be provided to the individuals either at the time of collection or at the time of the decision.

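To put the eight-character password into perspective, a quick back-of-the-envelope entropy estimate shows how small a search space it offers for data of this sensitivity. The calculation assumes, for illustration, a uniformly random password over a 62-character alphanumeric alphabet, an upper bound that human-chosen passwords rarely reach.

```python
import math

def entropy_bits(length: int, alphabet_size: int) -> float:
    """Theoretical entropy of a uniformly random password, in bits."""
    return length * math.log2(alphabet_size)

print(entropy_bits(8, 62))   # ~47.6 bits
print(entropy_bits(16, 62))  # ~95.3 bits, a vastly harder target
```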

CNIL has given the Ministry of the Interior a timeframe to take corrective action on the series of injunctions issued.

As of July 2021, the State had notified CNIL that more than three million cards had been deleted in compliance with the retention period rules. With regard to the manual files, however, CNIL rejected the suggested four-year period for their destruction, stating that the age of the cards concerned, the duration of the breach and the nature of the data did not allow for it. CNIL asked that the physical files be disposed of by 31st December 2022. For all other matters of compliance, the CNIL has given a deadline of 31st December 2021. According to the law, a fine cannot be imposed on the State.


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.