
Facial Recognition and GDPR

Facial recognition is growing by leaps and bounds, so what of privacy and data protection? Today we take a deeper look at facial recognition and the GDPR.

From surveillance to marketing, advances in technology have resulted in the commercialization, and some may even say normalization, of facial recognition. Used by airport terminals, mobile phone makers and social media companies like Facebook, it's likely that facial recognition has already touched your life in some way. In the months and years ahead we can certainly expect greater integration of this identification tool in society.

For instance, last month, South Wales Police began testing a new facial recognition app on officers’ mobile phones to help identify suspects and vulnerable persons. Reports also indicate that facial recognition will play a significant role at next year’s 2020 Olympics.

Yet while trends indicate an upsurge in the use of this seemingly convenient and exciting technology, it is not without privacy concerns.

As explained by data mining specialist and biometrics research engineer Christina-Angeliki Toki in an interview with Aphaia, these privacy risks and concerns include: 

“Re-use, unauthorized access or theft, over which the data subject has no control, [therefore] interfering with the fundamental rights of the data subject, in an excessive and disproportionate way.” 

So, as it relates directly to facial recognition and the GDPR, what are the possible implications for entities which fail to implement the measures necessary to mitigate privacy risks? Well, for starters, administrative fines.

In one of our recent blogs we saw that a Swedish school had been fined €20,000 for privacy and data protection infringements related to its use of facial recognition. 

Meanwhile, in a September 4th statement, the ICO urged police forces and private organizations alike using “intrusive” facial recognition technology to be aware that existing data protection law and guidance still apply.

This is because facial recognition constitutes the use of biometric data, i.e. a way to measure a person’s physical characteristics to verify their identity. Biometric data is therefore personal data which must be processed on a lawful basis in compliance with the GDPR and the UK’s Data Protection Act.

While the cases above relate to public bodies using facial recognition, we should note that this technology is also widely used by private companies.

That said, what are the main GDPR requirements that businesses (and public bodies) implementing facial recognition should comply with?

• Identify a lawful basis for processing. Considering that biometric data is deemed a special category of personal data, the valid bases for this type of processing are quite limited.

• Implement appropriate security measures.

• Where the facial recognition system is also used for automated decision making, additional safeguards should apply.

• Facial recognition is a new technology that processes special categories of personal data and may be used both for profiling and for large-scale monitoring of individuals in publicly accessible areas, among other purposes, so it is mandatory to run a DPIA beforehand.

For additional insight on facial recognition and GDPR specific to your company’s operations and needs, contact us today. Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.


Smart Cities and Privacy

At its core, a smart city involves vast amounts of data and an intelligent network of connected devices transmitting that data. This creates big privacy challenges and risks.

Simply put, a smart city is a town, district or area which incorporates digital technology and data across all municipal functions in order to improve government services and enhance the way of life of its citizens.

Categories of data that a smart city may collect include:

• Traffic data, waiting times, crowd management, smart cars, parking

• Climate and weather, pollution, waste management

• Citizens’ civil information: census, elections, work

• Education and health: school grades, absenteeism, number of doctor appointments per year, most common illnesses

• Entertainment and consumption: shops, theaters, cinemas (most/least popular ones, time and money citizens spend)

• Security and surveillance: CCTV, police data

The collection, usage and interconnection of this level of data is exactly why smart cities create big privacy challenges and risks, says Aphaia Partner Cristina Contero.

Presenting at the 8th International Conference on Fibre Optics in Access Networks (FOAN2019) in Sarajevo last week, Cristina highlighted two significant data privacy issues:

• Legitimate basis for processing; and
• Security measures to protect information.

Identifying the legitimate basis to process data

While most of the data collected and used in smart cities will be aggregated data, Cristina says that there is a risk, higher so in small cities, that individuals may be indirectly identified due to the sheer amount of data and cross-referenced sources.

“How many 28-year-old citizens with a red car, who live next to this particular neighborhood, have two small children and are diabetic might there be in a city with 20,000 inhabitants? Maybe not as many as we could imagine,” she offered.
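Cristina's point can be illustrated with a back-of-the-envelope calculation. The sketch below estimates how many residents match a combination of quasi-identifiers, assuming the attributes occur independently; all the frequencies are illustrative assumptions, not real statistics.

```python
# Rough estimate of expected residents matching a set of quasi-identifiers,
# assuming the attributes are statistically independent.
# Every frequency below is an illustrative assumption.

population = 20_000

# Assumed fraction of residents matching each attribute
attribute_frequencies = {
    "aged 28": 0.015,              # roughly one birth-year cohort
    "owns a red car": 0.05,
    "lives in this neighborhood": 0.02,
    "has two small children": 0.10,
    "is diabetic": 0.06,
}

expected_matches = population
for freq in attribute_frequencies.values():
    expected_matches *= freq

print(f"Expected matching residents: {expected_matches:.4f}")
```

Under these assumptions the expected count falls well below one, meaning the combination of attributes would most likely single out a unique individual, which is exactly why aggregated smart-city data can still be indirectly identifying.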

As a result, in the set-up of smart cities, compliance with the GDPR’s requirement for a lawful basis is essential.

According to the GDPR, there are six lawful bases:

(a) Consent.

(b) Performance of a contract.

(c) Legal obligation.

(d) Vital interests.

(e) Public interest.

(f) Legitimate interest.

According to Cristina, it is most likely that a government’s legitimate basis regarding the set-up of a smart city will fall under public interest.

Public interest can apply when either:

• it is a specific task carried out in the public interest which is laid down by law; or
• it is the exercise of official authority (for example, a public body’s tasks, functions, duties or powers) which is laid down by law.

Cristina also explained that in order to rely on public interest, the government must first:

• document the decision that the processing is necessary for it to perform a task in the public interest or exercise its official authority;
• identify the relevant task or authority and its basis in common law or statute; and
• include basic information about the purposes and lawful basis in the privacy notice.

Security Challenges

Large amounts of data, multiple stakeholders, and the gathering and sharing of data in real time are all sources of privacy risk in smart cities.

To this end, it is imperative that the economic resources to prevent or address security breaches are identified and secured even before a smart city is developed, says Cristina.

Setting up an insecure Smart City structure will be much more costly in the long term than doing it properly from the very beginning. And if you do not have the resources to do it at the beginning, then do not do it.

In keeping with the GDPR, governments will also have to implement technical and organizational measures to ensure a level of security appropriate to the risks.

Meanwhile, Cristina offered that the adoption of a three-layered security approach can go a long way towards helping smart cities secure their networks and prevent or minimize security breaches such as hacking.

Helpful security models include a layered approach, which features a system where all smart network devices have a unique identifying number and operate within three layers of security:

• a data protection application for the server (to identify malicious content);

• a data scrutiny layer (acting as a firewall to protect servers); and

• secure smart software for devices (to prevent malicious software from being installed on the devices).
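To make the layered idea concrete, here is a minimal sketch of how messages from smart devices might be filtered through three successive checks. The layer functions, device registry and "blocked patterns" are all hypothetical stand-ins; a real deployment would rely on dedicated security infrastructure, not application code like this.

```python
# Illustrative sketch of a three-layer filter for smart-city device traffic.
# All names and checks here are hypothetical examples, not a real product.
from dataclasses import dataclass


@dataclass
class Message:
    device_id: str   # unique identifying number of the smart device
    payload: bytes
    signed: bool     # stand-in for real firmware attestation


REGISTERED_DEVICES = {"sensor-001", "camera-042"}  # assumed device registry
BLOCKED_PATTERNS = (b"<script>", b"DROP TABLE")    # toy malicious markers


def server_content_check(msg: Message) -> bool:
    """Layer 1: data protection application - reject malicious content."""
    return not any(p in msg.payload for p in BLOCKED_PATTERNS)


def data_scrutiny_check(msg: Message) -> bool:
    """Layer 2: scrutiny layer acting as a firewall - only known devices."""
    return msg.device_id in REGISTERED_DEVICES


def device_software_check(msg: Message) -> bool:
    """Layer 3: secure smart software - only attested/signed firmware."""
    return msg.signed


def accept(msg: Message) -> bool:
    # A message must pass all three layers to reach the city platform.
    return all(check(msg) for check in (
        server_content_check, data_scrutiny_check, device_software_check))


ok = accept(Message("sensor-001", b"temp=21.5", signed=True))
bad = accept(Message("unknown-dev", b"<script>alert(1)</script>", signed=False))
```

The design point is that each layer can reject a message independently, so a compromise of one check (say, an unregistered device spoofing content that looks benign) is still caught by another.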

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.


Aphaia attends FOAN2019

The 8th International Conference on Fibre Optics in Access Networks (FOAN2019) was held 2-4 September at the Swissotel in Sarajevo.

More than 60 talks were delivered at FOAN2019 by top professionals in the telecoms field, including people from industry, academia, government and regulatory agencies. The event brought together attendees from all around the world, from southern and eastern Europe, Japan and the US, among others.

Over the three days, the participants were able to share their thoughts and projects, plus enjoy two networking dinners in the beautiful old town of Sarajevo.

Special mention goes to Edvin Skaljo, who, as chair, together with the rest of the team, made this all possible.

The talks, focused on fibre optics in access networks, addressed the field from several different perspectives: IoT, Big Data, 5G, IP rights, data protection, smart cities and more. Workshops and student demo sessions were run in parallel in another room of the venue. Local TV and press came to FOAN2019 and documented the different activities and talks.

Aphaia was invited to deliver a speech and to moderate a panel discussion. Our Partner Cristina Contero Almagro gave a talk about smart cities and privacy and also chaired a panel discussion about smart cities and regulation on Day Three of FOAN2019.

She presented some of the main privacy and security challenges that smart cities are currently tackling, such as potential data breach risks and the need to identify an adequate legitimate basis for the processing.

The guests of the panel discussion were Igor Jurcic, Head of Marketing, Business Group for VSE/SME; Tarik Hamzic, Vice President of Operations at Ministry of Programming in Bosnia and Herzegovina; Aleksandar Mastilovic, Expert Adviser to the Director General at the Communications Regulatory Agency of Bosnia and Herzegovina; and Aljo Mujcic, Professor at the Faculty of Electrical Engineering, University of Tuzla (Bosnia). They discussed three pillars of smart city regulation: the smart city concept, smart cities and academia, and smart city investment.

FOAN2019 was an enriching experience we are very grateful to have been part of, and we are already looking forward to FOAN2020!


Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

Voiceprinting Privacy


Voiceprinting is becoming a widespread identification and authentication tool for banks and even public authorities. Voiceprinting privacy concerns and GDPR compliance need to be discussed too.

As technology advances and the world shifts more and more towards electronic and online platforms, new means of digitally identifying individuals are constantly being introduced.

One such digital identification tool which has been experiencing a surge of use over the last three to five years is voiceprinting: technology which authenticates individuals by voice alone.

In fact, across the globe, several organizations including banks, credit unions and government agencies are already making use of this technology.

In 2016, for instance, Citi was reported to have launched a project to automatically verify a customer’s identity by voice within the first few seconds of a conversation. Citi’s adoption of voiceprinting was presented as a means of reducing time to service by eliminating the manual authentication process, potentially cutting a typical call center call by a minute or more.

Yet while voiceprint technology is being lauded as a security game-changer and a customer-service home run, there are undoubtedly privacy and data protection concerns.

Just four months ago the Information Commissioner’s Office (ICO) issued a final enforcement notice to HM Revenue & Customs (HMRC) to delete millions of unlawfully collected voiceprints, after an investigation revealed that the UK tax office had collected biometric data without giving customers sufficient information about how it would be processed and had also failed to give customers the chance to give or withhold consent. The May 2019 final enforcement notice gave HMRC 28 days to complete the deletion of all biometric data held under the Voice ID system for which it did not have explicit consent.

“Since a voiceprint is regularly used to re-identify a person, it needs to be processed based on a lawful processing basis, just like any other personal data. This basis may be the individual’s consent or legitimate interest, subject to legitimate interest assessment in line with GDPR,” comments Dr Bostjan Makarovic, Aphaia managing partner.


Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.