New EU law

New EU law imposes a time limit on tech giants to remove content

New EU law imposes a time limit of one hour on tech giants to remove terrorist content. 

 

Last month, the European Parliament adopted a new EU law forcing online platforms to remove terrorist content within an hour of receiving a removal order from a competent authority. According to a report from Euractiv, this regulation on preventing the dissemination of terrorist content online has faced some opposition and has been called controversial. The European Commission drafted the law in response to several terror attacks across the bloc. The regulation, considered a necessary step in combating the dissemination of terrorist content online, came into effect on April 28th, after being approved by the Committee on Civil Liberties, Justice and Home Affairs in January. 

 

The proposed legislation was adopted without a vote, after approval from the Committee on Civil Liberties, Justice and Home Affairs. 

 

On January 11, the Committee on Civil Liberties, Justice and Home Affairs (LIBE) approved the proposed legislation, with 52 votes in favor and 14 against. A decision was made to forgo a new debate in the chamber, and the proposal was approved without being put to a vote in the plenary. Since then, the law has come under scrutiny, and some have expressed discomfort that this new EU law was implemented without sufficient opportunity for debate. There are fears that the law could be abused to silence non-terrorist speech which may be considered controversial, or that tech giants may begin preemptively monitoring posts themselves using algorithms. 

 

Critics claim that such a short deadline could encourage tech giants to rely more heavily on automated moderation algorithms. 

 

This law has been called ‘anti-free speech’ by some critics, and MEPs were urged to reject the Commission’s proposed legislation. Prior to the April 28th meeting, 61 organisations collaborated on an open letter to EU lawmakers asking that the proposal be rejected. While the Commission has sought to calm many of those fears and worries, some criticism of this new EU law lingers. Critics fear that the short deadline imposed on digital platforms to remove terrorist content may result in platforms deploying automated content moderation tools. They also note that the law could potentially be used to unfairly target and silence non-terrorist groups. The critics of this law further stated that “only courts or independent administrative authorities subject to judicial review should have the power to issue deletion orders”. 

 

Provisions have been added to the new EU law taking criticisms into account. 

 

In the face of criticism of the new EU law, lawmakers seem to be taking the feedback seriously and have added a number of safeguards to the proposed legislation. It has been specifically clarified that the law is not to target “material disseminated for educational, journalistic, artistic or research purposes or for awareness-raising purposes against terrorist activity”. This was done in an effort to curb opportunistic attempts to use the law to target and silence non-terrorist groups over disagreements or misunderstandings. In addition, the regulation now states that “any requirement to take specific measures shall not include an obligation to use automated tools by the hosting service provider”, addressing the possibility of platforms feeling compelled to use automated filters to monitor posts themselves. Transparency obligations have also been added to the proposed legislation; however, many critics remain dissatisfied with the modifications. 

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.

SCCs and Privacy Shield

SCCs and Privacy Shield replacement updates, what can we expect?

SCCs and a Privacy Shield replacement are both of paramount importance to trans-Atlantic data flows; however, right now the focus may be more on the new SCCs. 

 

Almost one year after the CJEU “Schrems II” decision, a new EU-US Privacy Shield may still be far off. However, with Standard Contractual Clauses having been upheld and used quite frequently to facilitate cross-border data flows, new SCCs can be expected soon. According to this IAPP article, new SCCs may be here within a matter of weeks. Bruno Gencarelli, Head of International Data Flows and Protection at the European Commission, said: “We are about to, because it’s a question of weeks, adopt modernized SCCs that are aligned with the (EU General Data Protection Regulation) and much better adapted to the reality of today’s digital economy”.

 

The new Standard Contractual Clauses are expected shortly, with the Commission taking into account the feedback received on the draft SCCs. 

 

Since the Schrems II decision, SCCs have been upheld, but with a few caveats. They have been put to use to facilitate data flows between the EU and the US; however, this has not been without incident. While privacy professionals await conclusive information regarding data flows across the Atlantic, there have been some recent developments. Bruno Gencarelli, during IAPP’s Global Privacy Summit Online, said that the new Standard Contractual Clauses will soon be adopted. Based on the feedback the European Commission received, Gencarelli called the draft SCCs an “enormous success”, with the Commission taking this feedback very seriously. The ongoing process is intended to modernize the SCCs to better suit the scale and complexity of today’s digital economy. 

 

“This is a much awaited step forward which, once in place, will help to unify the dissimilar criteria that EU Supervisory Authorities have been applying since Schrems II when it comes to international data transfers, as we have recently seen with the Bavarian and French DPAs decisions”, comments Cristina Contero Almagro, Aphaia’s Partner.

 

Negotiations on a Privacy Shield replacement are intensifying, but a replacement may still be far off. 

 

While there is a willingness on each side to make a deal on a replacement for Privacy Shield, it is a balancing act between privacy and national security, making this a delicate and complex situation. As we have seen since Schrems II, SCCs, while very useful, may not always be enough. As each side seeks to create a durable replacement for Privacy Shield, one that can stand up to legal challenges and political scrutiny, talks are underway for a solution that will meet the needs of both parties with regard to both privacy and national security. 

 

Do you make international data transfers to third countries? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, transfer impact assessments and Data Protection Officer outsourcing.  Contact us today.

The new EU AI Regulation

The new EU AI Regulation: leaked document reveals intentions to establish rules for high risk AI

The new EU AI Regulation, revealed in a recently leaked document, includes several intended provisions specific to high risk AI. 

 

The European Commission has proposed a Regulation of the European Parliament and of the Council aimed at governing the use and sale of high risk AI within the European Union. In the recently leaked document, the Commission stated that “Artificial intelligence should not be an end in itself, but a tool that has to serve people with the ultimate aim of increasing human well-being.” With that in mind, the European Commission has set out to further define and clarify what constitutes high risk AI, and to set out rules aimed at ensuring that AI safely and effectively serves the highest good of natural persons. “It is a step in the right direction, providing further ethical safeguards as our lives become increasingly dominated by machine-made decisions”, comments Dr Bostjan Makarovic, Aphaia’s Managing Partner.

 

The document outlines harmonised rules concerning the placing on the market, putting into service and use of high-risk AI systems in the Union. It also includes harmonised transparency rules for AI systems intended to interact with natural persons and AI systems used to generate or manipulate image, audio or video content. The regulatory framework laid out in this document is intended to function without prejudice to the provisions of existing Union regulations applicable to AI, falling within the scope of this regulation. 

 

The new AI Regulation will apply to providers and users of AI systems in the EU or in third countries to the extent that they affect persons in the EU.

 

This Regulation will apply to providers placing AI systems on the market or putting them into service in the European Union, whether those providers are established in the EU or in a third country outside the Union. In addition, the regulation will apply to users of AI systems located in the EU, as well as to providers and users of AI systems established in third countries to the extent that these systems affect persons within the EU. 

 

Article 3 of the leaked document defines an AI system as “software that is developed with one or more of the approaches and techniques listed in Annex I and can, for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations, or decisions influencing real or virtual environments.” This can constitute a component of a product or a standalone product, the output of which may serve to partially or fully automate certain activities. Annex I outlines several approaches and techniques which indicate artificial intelligence, including machine learning approaches (supervised, unsupervised and reinforcement learning), logic- and knowledge-based approaches, and statistical approaches. 

 

The leaked document outlines several prohibitions intended to be established for the protection of the fundamental rights of natural persons. 

 

This document goes on in Article 4 to outline the prohibited AI practices: a list of artificial intelligence practices which are prohibited as contravening the fundamental rights protected under EU law and Union values. Title II Article 4 prohibits the use of AI systems that manipulate human behavior, opinions or decisions through choice architectures or any other element of the user interface, causing persons to make decisions, form opinions or behave in a manner that is to their detriment. In addition, this regulation prohibits the use of AI in any manner that exploits information or predictions about people in an effort to target their vulnerabilities, leading them to behave, form an opinion or make decisions to their detriment. The regulation will also prohibit indiscriminate surveillance applied to all natural persons in a generalised manner without differentiation. Article 4(2) does, however, state that these prohibitions do not apply when the practices are authorised by law, or are carried out by, or on behalf of, public authorities in order to safeguard public security, subject to appropriate safeguards for the rights and freedoms of third parties and compliance with the laws of the EU. 

 

Cristina Contero Almagro, Partner at Aphaia, points out that “It should be noted that this new Regulation mentions ‘monitoring and tracking of natural persons in digital or physical environments, as well as automated aggregation and analysis of personal data from various sources’ as elements that the methods of surveillance could include, which means that these provisions might potentially reach any online platform which relies on automated data aggregation and analysis. 

 

Considering that the Regulation takes a risk-based approach and that it interlinks with the GDPR in some areas, this only confirms the importance for businesses to ensure that their systems and processes comply with the data protection framework. In particular, Data Protection Impact Assessments, over which the Conformity Assessment would be built, play a paramount role”.

 

The new AI Regulation specifically defines what constitutes high risk AI systems, in order to outline exactly which systems will be subject to this regulation. 

 

With regard to high risk AI systems, the document has a specific section (Annex II) dedicated to defining with precision, complete with examples, what constitutes a “high risk artificial intelligence system”. Anything that falls within that purview is subject to specific rules and regulations to ensure the best interest of persons. Compliance with these requirements will be required before these systems are placed on the market or put into service. The regulation covers the use of data sets for training these systems, documentation and record keeping, transparency, robustness, accuracy, security, and human oversight. The leaked document includes several obligations for the providers and users of these systems, as well as authorised representatives, importers and distributors.

 

The Regulation sets forth intended measures to support innovation in AI and aid SMEs in ensuring compliance. 

 

This document also sets forth intended measures in support of innovation in AI. These include AI regulatory sandboxing schemes, which national competent authorities in any of the Member States may establish. In order to reduce the regulatory burden for small and medium enterprises and startups, additional measures will be implemented, including priority access to these sandboxes, as well as digital hubs and testing and experimentation facilities. These hubs are intended to provide AI providers with relevant training on the regulatory requirements, as well as technical and scientific support, and testing facilities. 

 

The new AI Regulation indicates the intention for the establishment of a European Artificial Intelligence Board. 

 

The document indicated the intention to establish a European Artificial Intelligence Board, tasked with ensuring the consistent application of this regulation by the Member States. This board will be expected to issue opinions or interpretive guidance documents clarifying the application of the Regulation, collect and share best practices among Member States, aid in the development of standards regarding AI, and continuously monitor developments in the market and their impact on fundamental rights. The European Artificial Intelligence Board will also be expected to ensure consistency and coordination in the functioning of the AI regulatory sandboxes previously mentioned. The board will issue opinions before the Commission adopts a delegated act and will coordinate, in carrying out its tasks, with the relevant bodies and structures established at EU level, including the EDPB. 

 

Do you use AI in your organisation and need help ensuring compliance with AI regulations? We can help you. Aphaia provides EU AI Ethics Assessments, Data Protection Officer outsourcing and ePrivacy, GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments. We can help your company get on track towards full compliance.

Digital Green Certificates

Digital Green Certificates: the EDPB and EDPS release a joint opinion

Digital Green Certificates have been a topic of debate lately, and the EDPB & EDPS have released a joint opinion on this, regarding data protection and privacy.

Digital Green Certificates, which some refer to as “vaccine passports”, are, contrary to popular belief, not specific to vaccines. In actuality, the digital green certificates, or passes as they would preferably be called, are proposed to take the form of a QR code containing information on a person’s status with regard to the COVID-19 virus. The information may pertain to the vaccine, with details on which vaccine was taken and when it was administered, or it may concern a negative COVID-19 test and the date on which the last test was taken. The scannable code may also contain information on antibodies present in a person’s system, if they have developed antibodies from being infected with and recovering from the virus. Vaccines are not mandatory at this time, and the digital green certificates proposed by the European Commission are intended to make it easier to identify someone’s current status with regard to COVID-19, whether vaccinated or not, making travel throughout the EU more seamless during this global pandemic. 
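
The three possible statuses described above (vaccination details, a negative test, or recovery with antibodies) can be pictured as a simple data structure. The sketch below is purely illustrative; the class and field names are our own assumptions, not the schema actually adopted under the proposal.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: these class and field names are hypothetical
# and do not reflect the official certificate schema.

@dataclass
class VaccinationEntry:
    vaccine: str            # which vaccine was taken
    date_administered: str  # when it was administered (ISO date)

@dataclass
class TestEntry:
    result: str        # e.g. "negative"
    date_of_test: str  # date of the last test taken

@dataclass
class RecoveryEntry:
    disease: str           # limited to COVID-19 and its variants
    date_of_recovery: str

@dataclass
class GreenCertificate:
    holder_name: str
    date_of_birth: str
    # Typically only one of the three entries below would be present,
    # matching the holder's current status.
    vaccination: Optional[VaccinationEntry] = None
    test: Optional[TestEntry] = None
    recovery: Optional[RecoveryEntry] = None
```

In practice, a payload along these lines would be serialised and encoded into the QR code, so that a verifier can confirm the holder’s status from the scannable code itself.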

The EDPB and EDPS released this joint statement specific to the aspects of the Proposal pertaining to personal data protection. 

On March 17th, the Commission first published the proposal for a Regulation of the European Parliament and of the Council on the issuance, verification and acceptance of certificates of vaccination, testing and recovery for third-country nationals legally staying or residing in any of the EU Member States during the COVID-19 pandemic. The EDPB & EDPS note that the aim of this proposal is to facilitate the exercise of the right to free movement within the EU during the COVID-19 pandemic. Due to the particular importance of these proposals and their impact on individual rights and freedoms regarding the processing of personal data, the EDPB and EDPS released their joint opinion specific to the aspects of the proposal relating to personal data protection. The organisations highlight that it is essential that the proposal is consistent with, and does not in any way conflict with, the application of the GDPR. 

Digital Green Certificates should be approached from a holistic and ethical standpoint, as asserted by the EDPB and EDPS in their joint opinion. 

The EDPB and EDPS suggest that the Commission take a holistic and ethical approach to the proposal in an effort to encompass all the issues related to privacy, data protection, and fundamental rights in general. They note that data protection is not an obstacle to fighting the current pandemic, and that compliance with data protection law will only aid those efforts by helping citizens trust the frameworks provided. The EDPB and EDPS advise that any measure adopted by Member States or EU institutions must be guided by the general principles of effectiveness, necessity and proportionality. In addition, they note that the World Health Organisation (WHO), in its ‘interim position paper: considerations regarding proof of COVID-19 vaccination for international travelers’, stated that “(…) national authorities and conveyance operators should not introduce requirements of proof of COVID-19 vaccination for international travel as a condition for departure or entry, given that there are still critical unknowns regarding the efficacy of vaccination in reducing transmission.” 

The EDPB and EDPS, in their joint opinion, state that these green certificates must not lead to the creation of any central database of personal data at the EU level, under the pretext of the Digital Green Certificate framework. In addition, they made specific mention that these certificates should be made available in both digital and paper based formats, to ensure the inclusion of all citizens, regardless of their level of engagement with technology. The organisations also call for clarification on the proposal’s stance on the manner in which these certificates will be issued, whether automatically, or upon request of the data subject. Recital 14 and Articles 5(1) and 6(1) of the Proposal currently state “(…) Member States should issue the certificates making up the Digital Green Certificate automatically or upon request (…)”.

The EDPB and EDPS are glad to note the considerations to the rights and freedoms of individuals, as well as compliance with data protection regulation, included in the Proposal. 

The organisations are pleased to note that the Proposal explicitly states that compliance with European data protection regulation is key to the cross border acceptance of vaccination, test and recovery certificates. Recital 38 of the proposal states that “[i]n line with the principle of minimisation of personal data, the certificates should only contain the personal data necessary for the purpose of facilitating the exercise of the right to free movement within the union during the COVID-19 pandemic”. The EDPB and EDPS recommend the inclusion of reference to the GDPR in the main text of the proposal, as it is the legal basis for the processing of personal data, for the issuance and verification of interoperable certificates, as acknowledged in Recital 37. 

Article 3(3) of the Proposal states that citizens can obtain these certificates free of charge, and may renew them to bring the information up to date, or replace them as necessary. While the EDPB and EDPS commend this, the organisations also recommend clarifying that the original certificate, as well as any modifications, shall be issued upon request of the data subject. This is very important for maintaining accessibility for all persons. 

The EDPB and EDPS call for attention to data minimisation, as well as clarification on the validity period of the data processed. 

There are naturally certain categories and data fields of personal data which would need to be processed within the framework of the Digital Green Certificates. As a result, the EDPB and EDPS consider that the justification for the necessary personal data fields needs to be clearly defined in the Proposal. In addition, the organizations ask that further explanation be provided as to whether all of the categories of personal data provided for are necessary for inclusion in the QR code for both digital and paper certificates. They note that data minimisation can be achieved by using data sets or QR codes of varying comprehensiveness, each containing only the data needed for a given purpose. The organizations also note the lack of specificity with regard to an expiry date or validity period for each certificate in the draft Proposal. It is also important to note that the EDPB and EDPS clearly state that, given the scope of the draft proposal and the context of the global pandemic, the statement of the disease or agent from which the individual has recovered should be limited to COVID-19 and its variants. 
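
The idea of differently comprehensive data sets or QR codes can be sketched as follows. This is a hypothetical illustration with made-up field names and purposes; it simply shows how a certificate payload could be reduced to the fields a given check actually needs.

```python
# Hypothetical illustration of data minimisation through differently
# comprehensive data sets: each verification purpose gets only the
# fields it actually needs, rather than the full certificate.

FULL_CERTIFICATE = {
    "holder_name": "Jane Doe",
    "date_of_birth": "1980-01-01",
    "vaccine": "ExampleVax",  # made-up value
    "date_administered": "2021-04-01",
    "test_result": "negative",
    "date_of_test": "2021-05-01",
}

# The smallest field set each (hypothetical) purpose requires.
PURPOSE_FIELDS = {
    "border_check": {"holder_name", "date_of_birth", "test_result", "date_of_test"},
    "venue_entry": {"test_result", "date_of_test"},
}

def minimise(certificate: dict, purpose: str) -> dict:
    """Return only the fields required for the stated purpose."""
    allowed = PURPOSE_FIELDS[purpose]
    return {k: v for k, v in certificate.items() if k in allowed}
```

Under this approach, a verifier scanning the reduced code for venue entry would see only the test result and its date, never the holder’s vaccination history.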

The EDPB & EDPS reiterate the importance of adequate technical and organizational privacy and security measures in the context of the proposal.

With regard to the Digital Green Certificate, the organizations suggest that privacy and security measures should be specially structured to ensure compliance by the controllers and processors of personal data required by this framework.  The opinion states that controllers and processors should take adequate technical and organizational measures to ensure a level of security that is appropriate to the level of risk of the processing of this personal data in line with Article 32 of the GDPR. These measures should include the establishment of processes for regular assessment of the effectiveness of the privacy and security measures which are adopted. 

While the EDPB and EDPS are pleased to note the clarification, within the Proposal, of the roles of data controllers and processors, the organisations suggest that the Proposal specify, through a comprehensive list, all entities foreseen to be acting as controllers or processors of the data in EU Member States, taking into account the use of these certificates across multiple Member States by persons traveling throughout the EU. They also suggest that the Proposal clarify the role of the Commission with regard to data protection law in the context of the framework, in guaranteeing interoperability between the certificates. In addition, the organisations call for attention to compliance with Article 5(1)(e) of the GDPR with regard to the storage of personal data, as well as clarification of the storage period that Member States should not exceed once the pandemic has ended. Furthermore, the EDPB and the EDPS recommend that the Commission explicitly clarify whether, and when, any international transfers of personal data are expected, and include safeguards within the legislation to ensure that third countries will only process the personal data for the specific purposes for which it is exchanged under the framework.

Does your company have all of the mandated safeguards in place to ensure compliance with the ePrivacy, GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides ePrivacy, GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, EU AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.