AEPD published guidelines

AEPD published guidelines on data protection and labor relations

AEPD published guidelines on data protection and labor relations in collaboration with the Ministry of Labor and with employers' and trade union organizations. 

The AEPD recently published these guidelines, aiming to offer a practical tool to help public and private organizations comply with the legislation in place. The agency prepared the guide in collaboration with the Ministry of Labor and Social Economy and with employers' and trade union organizations. The guide centers on compliance with the GDPR and the DPA, with specific focus on updates regarding the rights of workers and the collection and use of their data by employers. It covers a wide range of issues, including employee data protection within the organization, employer access to social media profiles, internal whistleblowing, and privacy for victims and alleged harassers in the workplace. 

The guidelines outline the general bases of legitimate data processing by employers.

The guidelines from the AEPD set out the data protection rights to be upheld in a working environment. In the document, the AEPD addresses the importance of applying the principle of data minimisation. An employment contract does not automatically give employers access to any and all personal information of employees; these guidelines therefore outline what information may or may not be necessary. The document sets limits on the processing of data in the hiring process, as well as throughout the course of the employment contract. The AEPD explains that, due to the duties of secrecy and security, personal data should only be known by the affected party and by those users within the organization who are authorized to use, consult or modify the data.

The AEPD suggests using the least invasive system possible for tracking employee working days. 

According to the guidelines published by the AEPD, the least invasive system possible should be adopted for tracking employee workdays. This information cannot be publicly accessible or located in a visible place. In addition, the data registered by these systems must not be used for any purposes other than tracking the working day. For example, in the case of a worker who travels to perform their role, a working day tracker may be used for the sole purpose of recording when their workday begins and ends, and not to constantly monitor their location. The processing of geolocation data requires a specific legal basis. 

The guidelines cover access by employers to social media profiles and data from wearable technology like smart watches. 

The AEPD explains that employees are not obligated to allow their employer to access or inquire into their social media profiles. This applies during the hiring process as well as throughout the execution of the employment contract. Even where a candidate for employment has a publicly accessible social media profile, an employer may not process any data obtained that way unless there is a valid legal basis for it. In that case, the employer must inform the worker and demonstrate what the legal basis is, including its relevance to the performance of the role. 

The guidelines also cover wearable devices, particularly the monitoring of health data through devices like smart watches. In general, this type of monitoring is prohibited for several reasons: it violates the principle of proportionality, as it involves the constant monitoring of special category data (health), and it could allow employers to access data on specific health conditions rather than exclusively the data assessing an individual's ability to perform their job.

The AEPD published guidelines specific to internal whistleblowing and privacy for victims and alleged perpetrators. 

In instances of gender-based violence or harassment, personal data, particularly the identity of those involved, is generally considered special category data. Sensitive data of this nature requires enhanced protection. According to the guidelines, an identification code should be assigned to both the alleged victim and the alleged perpetrator in these cases. Where processing is necessary for compliance with legal obligations, an employer may process a worker's data regarding their condition as a victim of gender-based violence or harassment. In cases of harassment at work, the identities of both the alleged harasser and the alleged victim must be protected. 

The guidelines state that the works council now has the right to information on the parameters of a company’s algorithms and artificial intelligence systems.

As the use of artificial intelligence becomes more prevalent, the guide includes groundbreaking provisions on the right of the works council to be informed by companies about the parameters of any algorithms or AI systems used within the company. This includes explanations of profiling that could possibly affect access to, conditions of, or maintenance of employment. This requirement was newly introduced into law (RD-law 9/2021), modifying the Workers' Statute and adding a further level of transparency to the process. 

Does your company have all of the mandated safeguards in place to ensure compliance with the ePrivacy, GDPR and national data protection legislation in handling employee data? Aphaia provides ePrivacy, GDPR and data protection consultancy services, including data protection impact assessments, EU AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.

Facebook loses challenge

Facebook loses challenge as court rules in favor of DPC

Facebook loses challenge as court rules in favor of DPC’s draft decision for an inquiry and suspension of Facebook’s data transfers to the US. 

Following the Schrems II judgement of last July, the Irish Data Protection Commission launched an inquiry into Facebook Ireland Ltd and moved to suspend the company's EU-US data flows. Facebook disagreed with the decision and chose to challenge it, asserting that the DPC's decision, and the procedures subsequently adopted, were susceptible to judicial review. This long-standing legal battle over Facebook Ireland's right to continue making data transfers to the US has now come to an end. The ruling, affirming the lead regulator's decision to suspend those EU-US data flows, is likely to have major effects on Facebook's operations. 

This decision is the culmination of an eight year battle, initiated by a 2013 complaint from Mr. Max Schrems.

Facebook Ireland, a subsidiary of the US company Facebook Inc, provides the social networks Facebook and Instagram to the European region, and houses its central administration and European headquarters in Dublin. In June 2013, Mr Maximilian Schrems filed a complaint with the DPC regarding the transfer of his personal data to the US by Facebook Ireland, claiming that it was unlawful under national and EU law, and in October 2013 the DPC stated that the matter would be “investigated promptly with all due diligence and speed”. In May 2016, the DPC wrote to Facebook Ireland and Mr Schrems with a draft decision that Standard Contractual Clauses could not lawfully be relied upon in respect of transfers of EU citizens’ personal data to the US. The matter was ultimately referred to the CJEU, which delivered its judgment in July 2020. The court ruled that, under the GDPR, EU residents whose personal data is transferred to a third country using Standard Contractual Clauses must be afforded the same level of protection guaranteed within the European Union by the GDPR. Since the authorities in the United States cannot be bound by Standard Contractual Clauses, data transferred there may not be effectively protected. As a result of last year’s judgment, the Irish DPC launched an inquiry and came to a preliminary decision to halt Facebook’s data transfers to the US, a decision that was subsequently challenged by Facebook. 

Facebook challenged the draft decision by the DPC claiming that they should have awaited guidance from the EDPB. 

Facebook challenged the draft decision, as well as the inquiry, claiming that the Data Protection Commission should have waited for guidance from the European Data Protection Board before proceeding with an inquiry and ordering suspension of its data transfers. The company asserted that, as a member of the EDPB, the DPC would have received imminent guidance from the EDPB and should not have acted prior to receiving it. That guidance was eventually published in November 2020, and on May 14th 2021 the High Court ruled that Facebook Ireland “has not established any basis for impugning the DPC decision or the PDD or the procedures for the inquiry adopted by the DPC.” The judge rejected claims by Facebook that the DPC was in breach of its duty in how the case was handled. Mr Justice David Barniville also noted, however, that the DPC should have responded to certain questions that Facebook raised in its October 2020 correspondence.

Facebook loses challenge as High Court ruling gives the Irish DPC the right to open a second “own volition” investigation against Facebook.

This long-standing battle has now come to an end, making the suspension of Facebook’s data transfers to the US all but inevitable. A second, “own volition” investigation has also been opened and is running simultaneously with the original complaint dating back to 2013, which led to the CJEU’s “Schrems II” decision. Regarding Facebook’s appeal of the DPC’s decision, the High Court, in the 127-page judgment outlining its judicial review of the case, rejected Facebook’s claims against the DPC. Eight years after the initial complaint, it is now certain that the DPC will have to act to stop Facebook’s EU-US data transfers. This decision is likely to heavily impact Facebook’s operations. Regardless, the company said it looked forward to defending its compliance before the Data Protection Commission.

Do you make international data transfers to third countries? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, transfer impact assessments and Data Protection Officer outsourcing.  Contact us today.

The CNIL issues its opinion

The CNIL issues its opinion on vaccine passes for mass gatherings

The CNIL issues its opinion on the implementation and use of vaccine passes for admittance to mass crowd events in France. 

As the world aims to resume somewhat normal activity during the global COVID-19 pandemic, France is considering the use of vaccine passes, or “green passes”, for admission to mass gatherings of at least 1,000 persons. This suggestion comes in an effort to re-open certain establishments and resume certain activities while minimizing the risk of contamination from the virus. These green passes, like those used for travel, will include information related to the COVID-19 vaccine, a negative COVID-19 test, or proof of recovery from the virus. While they were originally developed to facilitate travel during the pandemic, the Government of France seeks to take the opportunity to use them for access to mass crowd events, in an effort to resume those activities much sooner. 

The CNIL makes it clear that these passes are not to be used beyond the health crisis. 

The CNIL wishes to make clear that these passes are intended only for use during the pandemic and will be strictly temporary. Acknowledging the unprecedented nature of an initiative like this and the implications it may have for the lives of individuals, the Authority stresses that this measure is meant for the specific purpose of dealing with the current health crisis and should only be used for as long as its purpose remains applicable to the COVID-19 pandemic. In addition, the CNIL requests that the impact of this system on the health situation be monitored, studied and documented at regular intervals and on the basis of objective data, in order to determine whether public authorities should continue its use. 

The CNIL would like guarantees that the use of these passes is limited to mass crowd events. 

While the Authority acknowledges the functionality of these passes for admittance to mass crowd events, the CNIL wishes to make clear that, in the interest of respect for the fundamental rights and freedoms of persons, these passes should be limited to the mass crowd events for which they are intended. The Authority wants to ensure that the use of these passes excludes places related to the daily activities of the population, such as restaurants, workplaces and shops. In addition, these passes should not be used for admission to any venue linked to the usual exercise of fundamental freedoms (in particular the freedom to demonstrate, the freedom of political and trade union organization, and the freedom of religion). The CNIL notes that excluding these spheres and prohibiting the use of the passes within them is likely to minimize the implications of this system for the rights and freedoms of individuals. The CNIL also believes that there should be further clarification and transparency on the criteria qualifying events for which the use of these passes would be considered appropriate, along with measures ensuring that the passes are not used in places and at events which do not meet those criteria. 

The CNIL would like to ensure that the use of these passes does not result in discrimination, and protects the personal data of individuals. 

In order to avoid discrimination, the CNIL stresses the need for these passes to be accessible to all. This includes ensuring that passes are available on paper as well as in digital format. It is also important to ensure that there is no discrimination based on the type of evidence presented in these passes, whether it be evidence of vaccination, a negative COVID-19 test, or recovery from the virus. Due to the sensitive nature of the information used for these passes, it is very important to limit the disclosure of individuals’ health information. The CNIL therefore suggests implementing a solution that would restrict access to the certificates to persons authorized to verify them. In addition, the Authority believes that verification should produce only a color code (green or red), along with the identity of the holder, so as not to reveal whether the individual has been vaccinated, tested, or recovered from a previous COVID-19 infection.

Does your company have all of the mandated safeguards in place to ensure compliance with the ePrivacy, GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides ePrivacy, GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, EU AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.

New EU law

New EU law imposes a time limit on tech giants to remove content

New EU law imposes a time limit of one hour on tech giants to remove terrorist content. 

Last month, a new EU law was adopted by the European Parliament, forcing online platforms to remove terrorist content within an hour of receiving a removal order from a competent authority. According to a report from Euractiv, this regulation on preventing the dissemination of terrorist content online has faced some opposition and has been called controversial. The European Commission drafted the law in response to several terror attacks across the bloc. Considered a necessary step in combating the dissemination of terrorist content online, it was formally adopted on April 28th, after being approved by the Committee on Civil Liberties, Justice and Home Affairs in January. 

The proposed legislation was adopted without a vote, after approval from the Committee on Civil Liberties, Justice and Home Affairs. 

On January 11, the Committee on Civil Liberties, Justice and Home Affairs (LIBE) approved the proposed legislation, with 52 votes in favor and 14 against. A decision was made to forgo a new debate in the chamber, and the proposed legislation was approved without being put to a vote in the plenary. Since then, the law has come under critical eyes, and some have expressed discomfort with the implementation of this new EU law without sufficient opportunity for debate. There are fears that the law could be abused to silence non-terrorist speech deemed controversial, or that tech giants may begin preemptively monitoring posts themselves using algorithms. 

Critics claim that such a short deadline placed on tech giants could encourage them to use more algorithms. 

This law has been called ‘anti-free speech’ by some critics, and MEPs were urged to reject the Commission’s proposed legislation. Prior to the April 28th meeting, 61 organisations collaborated on an open letter to EU lawmakers asking that the proposal be rejected. While the Commission has sought to calm many of those fears and worries, some criticism of the new EU law lingers. Critics fear that the short deadline imposed on digital platforms to remove terrorist content may result in platforms deploying automated content moderation tools. They also note that the law could potentially be used to unfairly target and silence non-terrorist groups. The critics further stated that “only courts or independent administrative authorities subject to judicial review should have the power to issue deletion orders”. 

Provisions have been added to the new EU law taking criticisms into account. 

In the face of criticism of the new EU law, lawmakers appear to have taken the feedback seriously and have added a number of safeguards to the legislation. It has been specifically clarified that the law is not to target “material disseminated for educational, journalistic, artistic or research purposes or for awareness-raising purposes against terrorist activity”. This was done in an effort to curb opportunistic attempts to use the law to target and silence non-terrorist groups over disagreements or misunderstandings. In addition, the regulation now states that “any requirement to take specific measures shall not include an obligation to use automated tools by the hosting service provider”, addressing the concern that platforms might feel compelled to use automated filters to monitor posts themselves. Transparency obligations have also been added to the legislation; however, many critics remain dissatisfied with the modifications. 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.