Facial recognition payment system launched in Moscow

A facial recognition payment system recently launched in Moscow is the first to be used on such a large scale.


Commuters in Moscow, Russia, now have the option of using a new, voluntary payment method called “Face Pay”, which allows them to sign up by scanning their face in front of a camera at a designated turnstile. Prior to this, commuters had the option of submitting their card details along with photo identification. Moscow’s transport department has stated that any information stored on commuters to enable Face Pay is securely encrypted. The facial recognition payment system is entirely voluntary, as all other payment methods remain available.


Moscow is the first city to operate a facial recognition payment system on such a scale.


The city of Moscow is home to one of the world’s largest video surveillance systems. Facial recognition technology has been used there to enforce COVID-19 quarantines, as well as to make preventive arrests and detentions. The city was also reported to have used facial recognition during the 2018 football World Cup in an effort to find wanted criminals. This, however, is the first time Moscow has used facial recognition for a payment system, and the first time any city has done so on such a large scale. According to this article from Reuters, Maxim Liksutov, head of the Russian capital’s transport department, said in a recent statement: “Moscow is the first city in the world where this system is operating on such a scale.”


Moscow’s new facial recognition payment system has received backlash, particularly over privacy concerns.


The new facial recognition payment system in Moscow has already received some backlash. Digital rights groups claim that the system could undermine the privacy and human rights of the population. Roskomsvoboda, a digital rights and freedom of information organisation, has warned that a facial recognition payment system could be used for surveillance purposes. The city’s transport department maintains that commuters’ data will be securely encrypted. Since the payment system will be collecting biometric data, it is important that this data remain well protected.


Would this be allowed in Europe?

The GDPR does not prohibit the use of special category data such as fingerprints or facial recognition data; however, there are guidelines for its use, which serve to protect not only user data but also the rights and freedoms of individuals. As we reported on our blog in June, the EDPB and the EDPS called for a general ban on any use of AI for automated recognition of human features in publicly accessible spaces, in any context.

Cristina Contero Almagro, Partner in Aphaia, points out that “Considering the nature of the processing and the data involved, this type of activity would require both a Data Protection Impact Assessment and AI Ethics Assessment in order to identify and minimise the risks for the rights and freedoms of the data subjects. Together with the purpose limitation principle, ensuring that the transparency obligations are met would also be paramount for the processing to be lawful and several mitigation measures would need to be implemented, such as short storage times and the right to obtain human intervention”.


Do you need additional insight on facial recognition and the GDPR specific to your company’s operations? Aphaia can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Non-transparent data checks by utility company result in a fine

Non-transparent data checks by an electric supply company have resulted in a fine from the Hamburg DPA.


A recent fine from the Hamburg DPA is the direct result of an electric supply company performing non-transparent data checks. The company was offering discounted sign-up costs to first-time customers and, as part of that process, performed data checks to verify whether customers signing up were indeed new, first-time customers or had previously held accounts. These checks were not transparent, as the company failed to inform customers that they were part of the process, and the company was fined by the Hamburg DPA as a result. According to this release from the EDPB, a data check, or data comparison, is not in and of itself illegal. However, the fact that customers were not informed that these checks would be performed constituted a breach of the transparency obligation under the GDPR.


The electric supply company was found to have violated Articles 12 and 13 of the GDPR.


The electric supply company, Vattenfall Europe Sales GmbH, was found to have violated Articles 12 and 13 of the GDPR, after an assessment of its process by the Hamburg Commissioner for Data Protection and Freedom of Information. In total, around 500,000 people were affected. Article 13 sets out the information which must be provided to a data subject at the time their data is collected, while Article 12 requires that “The controller shall take appropriate measures to provide any information referred to in Articles 13… relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language…”


The established violation and fine are not related to the processing itself, but to the lack of transparency in communication with customers.


The fine, the corresponding violation and the eventual decision made this August by the Hamburg DPA are not related to the data comparisons themselves, as these are not, in and of themselves, explicitly regulated by the GDPR. The company performed data checks comparing the data received from customer sign-ups with customer data from previous years, which had been stored in accordance with tax and commercial law. The checks were intended to prevent customers from signing up for and receiving these bonus contracts repeatedly, which would have made the offer, meant to attract new customers, unprofitable for the company. The established illegality is limited to the insufficiently fulfilled transparency obligations towards customers.
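The mechanics of such a check can be sketched in a few lines. The following is a minimal, hypothetical illustration, assuming the comparison matches applicants against retained records on basic identity fields; the class, field names and matching logic are assumptions for illustration, not a description of Vattenfall’s actual system. As the DPA made clear, the lawfulness problem lay not in the check itself but in customers not being told about it.

```python
# Hypothetical sketch of a "new customer" data comparison.
# All names and fields are illustrative, not Vattenfall's actual system.
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen makes instances hashable, so they can go in a set
class Applicant:
    name: str
    date_of_birth: str
    address: str

def is_new_customer(applicant: Applicant, retained_records: set) -> bool:
    """True if the applicant matches no retained customer record."""
    return applicant not in retained_records

# Records retained under tax and commercial law (illustrative data)
retained = {Applicant("A. Example", "1980-01-01", "1 Main St")}

print(is_new_customer(Applicant("B. Sample", "1990-02-02", "2 High St"), retained))  # True
print(is_new_customer(Applicant("A. Example", "1980-01-01", "1 Main St"), retained))  # False
```

Under the GDPR’s transparency obligations, running such a check lawfully would also require informing applicants, at the point of data collection, that it takes place and for what purpose.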


The company has accepted the fine of EUR 901,388.84 and ceased the non-transparent data comparison immediately after the DPA’s first action.


Vattenfall Europe Sales GmbH did not contest the fine, which amounted to EUR 901,388.84, and immediately stopped performing the non-transparent data checks once the Hamburg DPA issued its initial decision. The company has cooperated fully with the Hamburg Commissioner and has agreed with the DPA on a way of informing first-time and existing customers about the data comparison and its purpose in a transparent and comprehensible manner. This will allow consumers to make an informed choice between a discounted bonus contract, knowing that it includes an internal verification of their status as a new customer, and a non-discounted contract which does not involve this data comparison.


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

National Police Board of Finland reprimanded over data breach during facial recognition trial

The National Police Board of Finland has recently been reprimanded for unlawful processing during a facial recognition trial by the National Bureau of Investigation unit.


Finnish police have been reprimanded for the unlawful processing of special categories of personal data during a facial recognition technology trial. In early 2020, the National Bureau of Investigation unit which specialises in the prevention of child sexual abuse experimented with the US-based Clearview AI facial recognition service to aid in identifying potential victims. According to this report, the National Police Board was not aware of the trial, as the decision to try the software had been made independently by the unit.


The data controller, the National Police Board, was held accountable for the data breach and for its inability to supervise the operation, as the processor had not been sufficiently informed of the protocol for handling special category data.


The National Police Board, in its capacity as controller of the data processed by the police in Finland, notified the Office of the Data Protection Ombudsman of the personal data breach in April 2021, after becoming aware of the situation through Buzzfeed News, an online media company from the US. The National Bureau of Investigation unit had made the decision to use Clearview AI independently of the guidance of the National Police Board, and as a result the controller was unable to approve and supervise the facial recognition technology trial. After experimenting with the technology in early 2020, the investigative unit concluded that it was not suitable for use by Finnish authorities.


The National Police Board did not advise the National Bureau of Investigation unit on the handling of special categories of data.

The National Police Board did not advise the unit on the manner in which special categories of data should be handled, or on how to carry out the processing lawfully, and as a result was held responsible for the personal data breach. The Act on the Processing of Personal Data in Criminal Matters and in Connection with Maintaining National Security places the responsibility for the lawful processing of personal data on the controller. In addition, under the GDPR, processors do not have the same compliance obligations as controllers: according to Article 24 of the GDPR, the data controller must be able to actively demonstrate compliance with all data protection principles. The Data Protection Ombudsman has therefore held the National Police Board accountable for this incident.

The National Police Board was reprimanded and ordered to take corrective action.


The National Police Board of Finland, as the controller, was held accountable for the breach and reprimanded. The Data Protection Ombudsman has also ordered the National Police Board to notify all data subjects whose identities can be confirmed of the personal data breach. In addition, the Board must request that all the data collected be deleted from Clearview AI’s storage database.


Do you use AI in your organisation and need help ensuring compliance with AI regulations? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Data Governance Act agreed upon by Council of the EU

The Data Governance Act is expected to give the EU a competitive advantage in a world that is becoming increasingly data-driven.


A mandate for a Data Governance Act has recently been agreed upon by the Council of the EU; this is expected to make data sharing easier, leading to several other benefits by extension. In an October 1st press release, the Council announced that Member States had agreed on a negotiating mandate on a proposal for a Data Governance Act (DGA). The act is intended to promote certain data sharing mechanisms, such as facilitating the reuse of certain categories of protected public-sector data, improving public confidence in data intermediation services and promoting data altruism.


The Data Governance Act will promote the reuse of public sector data while preserving privacy and confidentiality.


The Open Data Directive does not cover certain categories of public-sector data, such as data subject to the rights of others; the proposed DGA will complement it, allowing safer sharing of this category of data. The DGA will cover public-sector data protected by intellectual property rights, as well as trade secrets and personal data. Allowing this manner of reuse will require the technical capability to preserve privacy and confidentiality. The Council’s position promotes greater flexibility, respecting the pre-existing national specificities of the various EU Member States.


The DGA is expected to foster the creation of a new business model – data intermediation.


Data intermediation services can help facilitate sharing by providing secure environments for companies and individuals to share data. This may take the form of digital platforms and would help individuals exercise their rights under the GDPR, while facilitating voluntary data sharing by companies. This may include features such as personal data spaces or data wallets, which would allow people to have full control over their data while sharing with companies that they trust.


Service providers will need to be kept in a register which individuals can consult, to ensure that they are sharing data with providers they can safely depend on. In addition, these providers will only be able to use the data for its intended purposes; the data cannot be sold or used for any alternative purpose. As part of its process, the Council of the EU has clarified which types of companies can function as data intermediation service providers.
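As a rough illustration of the register and purpose limitation described above, the following sketch shows a lookup that permits a provider to use shared data only for its registered purposes. The register contents, provider names and function are hypothetical assumptions for illustration, not part of the DGA’s actual text or any real registry.

```python
# Hypothetical sketch: a register of data intermediation providers and
# the purposes for which each is permitted to use shared data.
REGISTER = {
    "TrustedIntermediary": {"medical-research", "mobility-statistics"},
}

def may_use(provider: str, purpose: str) -> bool:
    """A provider may only use shared data for its registered purposes."""
    return purpose in REGISTER.get(provider, set())

print(may_use("TrustedIntermediary", "medical-research"))  # True
print(may_use("TrustedIntermediary", "advertising"))       # False: data cannot be repurposed or sold
print(may_use("UnknownBroker", "medical-research"))        # False: not in the register
```

The point of the sketch is simply that both conditions in the text, presence in the register and a permitted purpose, must hold before any use of the data.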


Data altruism is expected to be made more feasible by the Data Governance Act, allowing companies and individuals to share data for the common good.


Individuals and companies may want, or need, to share data for research and other purposes in the public interest. The proposal for data governance is expected to make it easier to make data voluntarily available for these purposes. Organisations will be able to request registration to collect data for objectives of general interest, and registered organisations will be recognised across all EU Member States. The trust created by registration is expected to encourage individuals and companies to voluntarily share data with these organisations, and this data can then be used to benefit wider society. There will also be a compliance code of conduct to which these organisations must adhere, created in cooperation with data altruism organisations and relevant stakeholders.


A European Data Innovation Board will be created to ensure consistency in practice for all organisations involved.


The introduction of the DGA will usher in a new structure, the European Data Innovation Board, which will be tasked with maintaining consistency for organisations involved in the data sharing process. The Board will be expected to advise and assist the Commission in enhancing the interoperability of data intermediation services, and to ensure consistency in the processing of requests for public-sector data. Together, these changes are expected to foster increased sharing by reassuring the public that data sharing can indeed be safe while protecting their rights and freedoms.


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.