Spanish DPA launched Pacto Digital

Spanish DPA launched Pacto Digital, a digital pact for data protection

The Spanish DPA launched Pacto Digital, a digital pact for data protection, with the support of over 40 organisations.


The Pacto Digital initiative by the AEPD was officially presented to the public on January 28th, Data Protection Day, at a virtual event called “The Forum on Privacy, Innovation and Sustainability”. The event was streamed live, with several state, business and media officials in attendance. The initiative forms part of the Spanish DPA’s Social Responsibility and Sustainability Framework and aims to raise awareness, make data protection compatible with innovation and foster a commitment to privacy among organisations. The principles of the pact promote transparency, giving citizens greater awareness of what data is being collected and why. The initiative also promotes gender and race equality and ensures the protection of children and other vulnerable persons. It promotes and supports innovation by ensuring that technological advancements avoid perpetuating biases, particularly those based on race, origin, belief, religion and gender. 


The digital pact initiative launched by the AEPD consists of three documents: a contract, a digital responsibility pledge and a code of conduct. 


Organisations which subscribe to this digital pact all sign a contract showing their commitment to implementing the recommendations of the pact within their organisation. In addition, these organisations commit to giving their employees and users access to the Priority Channel to request the urgent removal of sexual or violent content online, as well as other key tools and resources to help raise awareness of the importance of privacy and personal data. 


As part of this initiative, the Spanish DPA has also introduced a Digital Responsibility Pledge containing obligations which subscribing organisations pledge to keep. It is not intended to impose responsibilities beyond the legislation to which these organisations are already subject. Rather, the pledge is tailored to the digital environment and geared towards securing a specific commitment from these organisations to uphold the standard. It outlines the existing responsibilities of organisations specifically geared towards safety and privacy online, and incorporates principles that should be considered to ensure that the ethics of data protection remain intact when designing and implementing new technological developments. 


Finally, the code of conduct for good privacy practices is geared towards organisations with their own dissemination channels and the media, both of which the AEPD intends to collaborate with to report issues of relevance to their networks and audiences. In addition, the code of conduct states that these organisations commit to refraining from identifying victims of the dissemination of sensitive content, or from publishing any information which could possibly identify them, particularly regarding public figures. 


Forty organisations signed the pact on January 28th; other interested parties may apply online. 


On January 28th, the 40 organisations which already form part of this pact made their commitment to the principles of data protection and privacy publicly known by signing the agreement. The digital pact is open to any organisation that wishes to assume the commitments reflected in the contract; interested organisations may apply online, publicly declaring their commitment to the principles outlined in the pact. 


Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.

Spanish DPA AEPD

Spanish DPA AEPD publishes Guidelines on AI audits

AEPD, the Spanish data protection authority, has published Guidelines on the requirements that should be implemented for conducting audits of data processing activities that embed AI.

Early this month, the Spanish DPA, AEPD, published Guidelines on the requirements that should be considered when undertaking audits of personal data processing activities which involve AI elements. The document addresses the special controls to which the audits of personal data processing activities comprising AI components should be subject.

Audits are part of the technical and security measures regulated in the GDPR and are deemed essential for the proper protection of personal data. The AEPD Guidelines contain a list of audit controls from which the auditor can select the most suitable ones on a case-by-case basis, depending on several factors, such as the way the processing may affect GDPR compliance, the type of AI component used, the type of data processing and the risks the processing activities pose to the rights and freedoms of the data subjects.

Special features of AI audits methodology

The AEPD remarks that the audit process should be governed by the principles laid down in the GDPR, namely: lawfulness, fairness and transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity and confidentiality, and accountability.

The AEPD also points out that not all the controls listed in the Guidelines are meant to be applied together. The auditor should select those that are relevant based on the scope of the audit and the goals it pursues.

What type of data processing do these requirements apply to and who should comply with them?

The Guidelines will be applicable where:

  • There are personal data processing activities at any stage of the AI component lifecycle; or
  • The data processing activities aim to profile individuals or make automated decisions which produce legal effects concerning the data subjects or similarly significantly affect them.

The AEPD states that in some cases it might be useful to carry out some preliminary assessments before moving forward with the audit, such as, inter alia, an assessment of the level of anonymisation of personal data, an assessment of the risk of re-identification and an assessment of the risk of losing data stored in the cloud.

The document is especially addressed to data controllers who audit personal data processing activities that include components based on AI; to data processors and developers who wish to offer additional guarantees around their products and services; to DPOs responsible for monitoring the data processing and providing advice to the data controllers; and to auditors who work with this type of processing.

Control goals and actual controls

The main body of the Guidelines consists of five audit areas that are broken down into several objectives containing the actual controls among which the auditors, or the person in charge of the process as relevant, can make their selection for the specific audit they are undertaking.

The AEPD provides an exhaustive list comprising more than a hundred controls, which are summarised in the following paragraphs. 

  • AI component identification and transparency

This area includes the following objectives: inventory of the AI components, definition of responsibilities, and transparency.

The AEPD stresses the importance of keeping full records both of the components (including, inter alia, ID, version, date of creation and previous versions) and of the persons in charge of the process (such as their contact details, roles and responsibilities). There are also some provisions regarding the information that should be available to the stakeholders, especially when it comes to the data sources, the data categories involved, the model and the logic behind the AI component, and the accountability mechanisms.

  • AI component purpose

There are several objectives within this area: identification of the AI component purposes, uses and context, proportionality and necessity assessment, data recipients, data storage limitation and analysis of the data subject categories.

The controls linked to these objectives are based on the standards and requirements needed to achieve the desired outcomes and on the elements that may affect that result, such as the conditioning factors, the socioeconomic conditions and the allocation of tasks, among others, for which a risk assessment and a DPIA are recommended.

  • AI component basis

This area is built on the following objectives: identification of the AI component development process and basic architecture, DPO involvement, and adequacy of the theoretical models and methodological framework.

The controls defined in this section are mainly related to the formal elements of the process and the methodology followed. They aim to ensure the interoperability between the AI component development process and the privacy policy, to define the requirements that the DPO should meet and guarantee their proper involvement in a timely manner and to set out the relevant revision procedures.

  • Data management

The AEPD details four objectives in this area: data quality, identification of the origin of the data sources, personal data preparation and bias control. 

Whereas data protection is the leitmotiv throughout the Guidelines, it is especially present in this chapter, which covers, inter alia, data governance, variables and proportionality distribution, lawful basis for processing, the reasoning behind the selection of data sources, and data and variables categorisation.

  • Verification and validation

Seven objectives are pursued in this area: verification and validation of the AI component, adequacy of the verification and validation process, performance, coherence, robustness, traceability and security. 

The controls set out in this area focus on ensuring data protection compliance for the ongoing implementation and use of the AI component, looking for guarantees around the existence of a standard which allows for verification and validation procedures once the AI component has been integrated, a schedule for internal inspections, an analysis of false positives and false negatives, a procedure to find anomalies and mechanisms for identifying unexpected behaviour, among others.

Final remarks

The AEPD concludes with a reminder that the Guidelines take a data protection approach to the audit of AI components. This means, on the one hand, that they may need to be combined with additional controls derived from other perspectives and, on the other hand, that not all controls will be relevant in every case: they should be selected according to the specific needs, considering the type of processing, the client’s requirements and the specific features of the audit and its scope, together with the results of the risk assessment.

Does your company use AI? You may be affected by the EU’s future regulatory framework. We can help you. Aphaia provides both GDPR and DPA 2018 adaptation consultancy services, including data protection impact assessments, EU AI Ethics assessments and Data Protection Officer outsourcing. Contact us today.

AEPD approved first BCRs

The AEPD has approved its first Binding Corporate Rules (BCRs) under the GDPR

The Spanish DPA, AEPD, has approved its first Binding Corporate Rules (BCRs) under the GDPR. The AEPD acted as lead DPA and received a favourable Opinion from the EDPB.

The AEPD has issued its final decision concerning the first binding corporate rules, drafted by Fujikura Automotive Europe Group, two months after the EDPB approved them. The decision will be included in the register of decisions which have been subject to the consistency mechanism, and it means that Fujikura Automotive Europe Group will be free, from now on, to use the BCRs for transferring personal data, with appropriate safeguards, to the group members based in third countries. 

What are BCRs?

GDPR defines Binding Corporate Rules as “personal data protection policies which are adhered to by a controller or processor established on the territory of a Member State for transfers or a set of transfers of personal data to a controller or processor in one or more third countries within a group of undertakings, or group of enterprises engaged in a joint economic activity”.

Once approved by the competent DPA, BCRs are considered a valid instrument that provides appropriate safeguards for personal data transfers to third countries.

What is the approval process of BCRs?

First, the lead DPA confirms whether the draft BCRs include all the mandatory requirements of article 47.2 GDPR. Then, pursuant to the consistency mechanism covered in articles 63 and 64.1 GDPR, the EDPB issues its opinion, after which the lead DPA communicates its final decision and, where approved, the BCRs are included in the relevant register.

How did the process apply to this case?

Pursuant to Recital 110 GDPR, “a group of undertakings should be able to make use of approved binding corporate rules for its international transfers from the Union to organisations within the same group”, as long as said BCRs include “all essential principles and enforceable rights to ensure appropriate safeguards for transfers”. 

Returning to this case, the BCRs were first drafted by Fujikura Automotive Europe Group and reviewed by the AEPD as the lead DPA. The AEPD then submitted its draft decision to the EDPB, which, early this year, issued an opinion considering that the BCRs contained appropriate safeguards to ensure that the level of protection of natural persons guaranteed by the GDPR was not undermined when transferring and processing personal data to and by the group members based in third countries. Two months later, the AEPD approved the BCRs and communicated its final decision to the EDPB.

Do you need assistance with the appropriate safeguards that should apply to international transfers of personal data? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. Contact us today.