
Biometrics researcher Christina-Angeliki Toli on GDPR biometrics challenges


Christina-Angeliki Toli, a data mining specialist currently working as a cryptography and biometrics research engineer, talks to us about GDPR biometrics challenges, including privacy-by-design solutions that are not only secure but also respect the rights of their users.


People often see the increased processing of biometric data as a threat to privacy. In what way does your work help keep personal biometric data safe?

Indeed, the widespread use of biometric systems, the nature of the shared data, and the range of use cases and applications all pose privacy risks. Along these lines, every time I design and analyse a biometric scheme, I try to address questions such as: “In what way can biometrics be considered privacy-friendly?”, “Could biometrics be a protection mechanism for individual privacy?” and “Can a biometric trait keep its source secret?” A pressing matter is to study these systems from two separate angles that are at once complementary and conflicting. To do that, we first combine the advantages of advanced cryptographic technologies with the pattern recognition field, and secondly we take into consideration the international standards that establish the sizes, configurations, protocols, processes and tools. In this way we can evaluate best practices for privacy and data protection in the processing of biometric data, determining performance in a way that is convenient for both developers and users. For me, the key is to mind the gap between biometric template protection technologies and international biometric standards, rules and guidelines under legal requirements, by exploiting the renewability, cancellability and revocability of biometrics.
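The cancellability and revocability mentioned above can be illustrated with a minimal sketch: instead of storing the raw biometric, store only a keyed one-way transform of it, so a leaked template can be revoked by re-enrolling with a fresh key. The function and variable names here are hypothetical, chosen for illustration; this is not the interviewee's own scheme.

```python
import hashlib
import hmac
import os

def protect_template(features, enrolment_key):
    """Derive a revocable template: HMAC a quantized feature vector
    with a per-enrolment key. If the stored value leaks, re-enrolment
    with a new key yields an unlinkable template, while the underlying
    biometric trait itself is never stored."""
    quantized = bytes(int(round(f * 255)) % 256 for f in features)
    return hmac.new(enrolment_key, quantized, hashlib.sha256).hexdigest()

# Enrolment: keep only the key reference and the protected template.
key = os.urandom(32)
template = protect_template([0.12, 0.87, 0.45], key)

# Verification: recompute from a fresh measurement and compare.
assert protect_template([0.12, 0.87, 0.45], key) == template

# Revocation: a new key produces a different, unlinkable template.
assert protect_template([0.12, 0.87, 0.45], os.urandom(32)) != template
```

Note that real biometric measurements are noisy, so deployed template protection schemes rely on error-tolerant constructions (e.g. fuzzy extractors) rather than an exact hash comparison; the sketch only shows the revocability idea.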

What are the major privacy risks one should focus on when performing data protection impact assessment for projects involving biometric data?

This always depends on the use case and the target group of users. Not all biometric deployments bear the same privacy risks; specific features of a deployment increase or decrease privacy. Probably the most important privacy risk during the implementation of a biometric system is the storage of biometric data together with any available user identity references. The concerns include re-use, unauthorized access and theft, over which the data subject has no control, and which for these reasons will equally be regarded as interfering with the fundamental rights of the data subject in an excessive and disproportionate way. Another threat is the performance error rates: researchers focus mainly on identifying the systematic and statistical errors of the measurements in order to increase the performance, accuracy and security of the schemes. Last but not least, biometrics have become a natural fit for cloud-based applications, such as mobile device log-in or e-payment systems. In such a setting, there are “authorized” parties involved in the verification or authentication phases. A biometric engineer should always take into account possible malicious models and collusion among the intermediary entities and third parties, and try to suggest methodologies that strengthen privacy and security.
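The performance error rates mentioned above are usually summarised as the false accept rate (FAR) and false reject rate (FRR) at a given decision threshold. A minimal sketch, using invented toy match scores rather than any real dataset:

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    """FRR: fraction of genuine (same-person) comparisons scoring
    below the threshold. FAR: fraction of impostor (different-person)
    comparisons scoring at or above it. Raising the threshold lowers
    FAR at the cost of a higher FRR, and vice versa."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr

genuine = [0.91, 0.88, 0.75, 0.69, 0.95]   # same-person match scores
impostor = [0.10, 0.35, 0.72, 0.20, 0.05]  # different-person scores

far, frr = error_rates(genuine, impostor, threshold=0.7)
# At 0.7, one impostor score (0.72) is accepted and one genuine
# score (0.69) is rejected: FAR = 0.2, FRR = 0.2.
```

Sweeping the threshold over all scores traces the trade-off curve evaluators use to pick an operating point for a given deployment.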

Does the proposed new EU General Data Protection Regulation (GDPR) in your view adequately address biometric data? Here I refer especially to the new rules on privacy by design and by default.

Clearly, the newly proposed EU General Data Protection Regulation (GDPR) adequately addresses the necessities of biometric data. As a tool, it is helpful for biometric engineers, and similarly there are multiple updated frameworks that provide suggestions for assessing the privacy threats of biometric applications. In the GDPR I can detect a precise functionality and, I could say, an era of innovation. However, in my opinion there are still some inadequacies in the separation of government and non-government biometric deployments. This is something I face on a daily basis: take, for instance, two schemes, one for physical access control in a building and another for airport access control. One need not be a biometric engineer to see that neither the requirements nor the regulations are the same. These disruptions affect research, from both innovation and implementation perspectives, but above all they have a negative impact on the business sector. Although privacy by design can offer a solution and plays an important role in protecting the privacy and data protection rights of individuals in biometric systems, we have to take more steps forward to achieve practically viable, privacy-friendly, secure biometric deployments.

Aphaia helps businesses in various industries adapt to new GDPR privacy rules that will start to apply in the UK and across Europe on 25th May 2018. We also act as external Data Protection Officers for businesses who will be required to have one as of the same date.
