“The implementation of automated decision-making and profiling in business may be directly impacted by GDPR,” warns Aphaia Partner Cristina Contero Almagro. “You may need to conduct a Data Protection Impact Assessment.”
The use of algorithms and AI methods has implications for privacy. Aphaia’s experts have developed their own approach to assisting clients, from customer recommendation algorithms used in e-commerce to the assessment of X5GON, a global, EU H2020-funded, AI-based network of Open Educational Resources.
If you require assistance, please get in touch with the Aphaia team. We’re here to help.

[Infographic: GDPR requirements …and more]
Discrimination in practice – A closer look
Incorrect labelling of training data or an unbalanced training dataset may result in an even higher risk to the rights and freedoms of data subjects when algorithms are applied in practice. These implications are illustrated in the diagram on the right and briefly summarised below.
Discrimination happens when individuals are categorised on the basis of variable features: each person is judged according to the group he or she fits into best. The use of AI by loan companies is a clear example. An applicant may be denied a loan simply because he or she has been placed in a group of people who do not make a concerted effort to pay off their debts, when in reality this only means that the variables assigned to that applicant resemble that group’s more closely than any other’s. This is why it is important to define the variables properly and to assign a suitable weight to each of them.
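To make the loan example concrete, the minimal sketch below uses invented data and scikit-learn’s LogisticRegression (not any particular lender’s system). It trains a simple model on an unbalanced dataset in which one group is heavily under-represented, then scores an applicant from that group who has a strong individual record, illustrating how the group his or her variables most resemble can dominate the decision.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, group_id, repay_rate, income_mean, debt_mean):
    """Generate synthetic applicants: [income (thousands), debt ratio, group id]."""
    income = rng.normal(income_mean, 8.0, n)
    debt_ratio = rng.normal(debt_mean, 0.1, n)
    repaid = (rng.random(n) < repay_rate).astype(int)
    X = np.column_stack([income, debt_ratio, np.full(n, group_id)])
    return X, repaid

# Unbalanced training data: group 1 is heavily under-represented and shows
# a slightly lower historical repayment rate in the recorded data.
X0, y0 = make_group(950, group_id=0, repay_rate=0.9, income_mean=45, debt_mean=0.3)
X1, y1 = make_group(50, group_id=1, repay_rate=0.8, income_mean=30, debt_mean=0.5)
X = np.vstack([X0, X1])
y = np.concatenate([y0, y1])

model = LogisticRegression(max_iter=1000).fit(X, y)

# A group-1 applicant with a strong individual record (good income, low debt).
applicant = np.array([[42.0, 0.25, 1]])

# The model may still score this person lower, because it leans on the group
# feature, learned mostly from the majority group's data.
print("With group feature   :", model.predict_proba(applicant)[0, 1])

# Re-fitting without the group proxy shows how the choice and weighting of
# variables changes the outcome for the very same individual.
model_no_proxy = LogisticRegression(max_iter=1000).fit(X[:, :2], y)
print("Without group feature:", model_no_proxy.predict_proba(applicant[:, :2])[0, 1])
```

The second score illustrates the point about variable selection and weighting: removing or re-weighting a group proxy can change the decision for the same applicant, which is exactly the kind of effect a Data Protection Impact Assessment should surface.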
