Digital Services Act package

EU to tackle online gatekeepers through Digital Services Act and the Digital Markets Act

According to the European Commission, the Digital Services Act package is a modern legal framework which will ensure the safety of users online and maintain a fair and open online platform environment.

The European Commission has recently announced two new pieces of legislation: the Digital Services Act (DSA) and the Digital Markets Act (DMA). Together, they aim to safeguard the digital space so that the fundamental rights of all users of digital services are protected, and to level the playing field to cultivate innovation, growth, and competitiveness in the European Single Market and globally.

The Commission prepared this legislative package after consulting a wide range of stakeholders, including, inter alia, the private sector, users of digital services, civil society organisations, national authorities, academia, the technical community, international organisations and the general public. Several consultation steps were carried out to fully capture these stakeholders’ views on the various issues related to digital services and platforms.

Digital Services Act is aimed at ensuring a safe and accountable online environment.

The DSA consists of new rules on digital services which place citizens at the centre and are intended to foster growth, competitiveness, innovation, and an upscaling of smaller platforms. The rules are designed to create transparency and clear accountability for online platforms, and will better protect consumers and their fundamental rights online. Citizens can expect to see more choice, lower prices and greater protection from illegal content. The rules should also mitigate systemic risks like manipulation and disinformation, while establishing greater democratic control over systemic platforms.

The Digital Services Act will govern online intermediary services that millions of Europeans use every day, such as online marketplaces, social networks, content-sharing platforms, app stores, hosting services and other intermediary services, with special rules expected for large online businesses reaching over 10% of the 450 million European consumers. Micro and small companies will have obligations proportionate to their size and capacity while remaining accountable. All online intermediaries offering their services in the single market, whether established in the EU or outside it, will have to comply with these new rules.

Digital Markets Act is designed to ensure fair and open digital markets.

The DMA targets a very specific group of organisations, deemed “gatekeepers”. The act establishes a set of narrowly defined, objective criteria for qualifying a large online platform as a gatekeeper, allowing the DMA to specifically target the problems it aims to tackle as regards large, systemic online platforms. The criteria include a strong economic position, significant impact on the internal market, a strong and durable intermediation position, and operation in multiple EU countries. Gatekeepers must adhere to the “do’s” and “don’ts” set out in the new rules in their daily operations, and the Commission will carry out market investigations to ensure these rules keep up with the fast pace of digital markets.

With the establishment of this new set of rules, consumers are expected to have better access to a range of services, more opportunities to switch provider, direct access to services, and more reasonable prices. Gatekeepers will keep all opportunities to innovate and offer new services; they will simply not be allowed to gain an undue advantage. For innovators and technology start-ups, the benefit of these rules is new opportunities to compete and innovate in the online platform environment, free from unfair terms and conditions that limit their development. In general, business users who depend on gatekeepers to offer their services in the single market will benefit from a more equitable business environment.

The new legislative strategies proposed will address several issues across the EU, and are expected to have significant positive impact on fundamental rights online.

The new legislative strategies proposed will be applicable across the EU and will target the significant gaps and legal burdens that remain despite several targeted, sector-specific interventions at EU level. While digital services include a wide range of online services, the rules specified in the Digital Services Act primarily concern online intermediaries and platforms such as online marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms. The Commission expects the Digital Services Act package to preserve the significant benefits these online platforms provide to consumers and innovation, while curbing the issues associated with the accelerating digitalisation of society and the economy.

Aphaia are leading experts in EU ICT policy, including telecoms, data protection, and AI.

Second European AI Alliance

Second European AI Alliance Assembly overview

The second European AI Alliance Assembly was hosted online due to the COVID-19 pandemic, on Friday 9th October.

The second edition of the European AI Alliance Assembly took place on Friday 9th October, a full-day event hosted online due to the COVID-19 pandemic. The Assembly gathered more than 1,400 viewers, who followed the sessions live and were also given the option to submit questions to the panellists.

The event

This year’s edition had a particular focus on the European initiative to build an Ecosystem of Excellence and Trust in Artificial Intelligence. The sessions were broken into plenaries, parallel workshops and breakout sessions. 

The following topics were addressed:

As a member of the European AI Alliance, Aphaia was pleased to join the event and enjoy some of the sessions addressing topics which are crucial in the development and implementation of AI in Europe, such as “Requirements for trustworthy AI” and “AI and liability”.

Requirements for trustworthy AI

The speakers shared their views on the risks posed by AI systems and the approaches that should be taken to enable the widespread use of AI in society.

Hugues Bersini, Computer Science Professor at Free University of Brussels, considered that there is a cost function whenever AI is used, and optimizing it is the actual goal: “Whenever you can align social cost with individual cost there are no real issues”.

Haydn Belfield, Academic Project Manager at CSER, Cambridge University, argued that the high risk AI systems may pose to people’s life chances and fundamental rights demands a regulatory framework with mandatory requirements that are, at the same time, flexible, adaptable and practical.

For Francesca Rossi, IBM fellow and the IBM AI Ethics Global Leader, transparency and explainability are key. She explained that AI should be used to support the decision making capabilities of human beings, who have to make informed decisions. This purpose cannot be achieved if AI systems are a black box.

In response to audience questions, the speakers discussed how many risk levels would be necessary for AI. The main conclusion was that, given that defining high risk is already a challenge in itself, having two risk levels (high risk and not high risk) would be a good start on which further developments could be built in the future.

The speakers briefly talked about each of the requirements highlighted by the AI-HLEG for trustworthy AI, namely: human agency and oversight; technical robustness and safety; privacy and data governance; transparency; diversity, non-discrimination and fairness; societal and environmental well-being; and accountability.

In our view, the discussions on AI and biases and on human oversight were especially relevant:

AI and biases

Paul Lukowicz, Scientific Director and Head of the Research Unit “Embedded Intelligence” at the German Research Center for Artificial Intelligence, defined machine learning as giving the computer methods with which it can extract procedures and information from data, and stated that it is at the core of AI’s current success. The challenge is that much of the bias and discrimination in AI systems comes from training data that itself contains bias and discrimination. It is not that developers somehow fail by providing unrepresentative data: they actually use data that is representative, and because discrimination and bias exist in our everyday life, this is what the systems learn and emphasize. Linked to this issue, he considers another pitfall to be uncertainty, as there is no data set that covers the entire world: “We always have a level of uncertainty in life, so we have in AI systems”.

Human oversight

Aimee Van Wynsberghe, Associate Professor in Ethics and Technology at TU Delft, raised some obstacles to human oversight:

  1. She challenged the premise that the output of an AI system is not valid until it has been reviewed and validated by a human. In her view, this can be quite difficult because there are biases that threaten human autonomy: automation bias, simulation bias, confirmation bias. Humans have a tendency to favour suggestions from automated decision-making systems and to ignore contradictory information produced without automation. The other challenge in this regard is that having an AI system create output, with a human reviewing and validating it, is very time- and resource-consuming.
  2. As for the alternative whereby the outputs of the AI system would become immediately effective, with human review ensured only afterwards, Aimee pointed out the issue of allocating responsibility for ensuring human intervention: “Who is going to ensure that human intervention happens? The company? Is it the customer, who would otherwise approach the company? Is it fair to assume that customers would have the time, the knowledge and the ability to do this?”
  3. Monitoring the AI system while in operation, with the ability to intervene in real time and deactivate it, would also be difficult because of human psychology: “there is a lack of situational awareness that does not allow for the ability to take over”.

AI and liability

Corinna Schulze, Director of EU Government Affairs at SAP; Marco Bona, PEOPIL’s General Board Member for Italy and International Personal Injury Expert; Bernhard Koch, Professor of Civil and Comparative Law and Member of the New Technologies Formation of the EU Expert Group on liability for new technologies; Jean-Sébastien Borghetti, Private Law Professor at Université Paris II Panthéon-Assas; and Dirk Staudenmaier, Head of Unit, Contract Law, in the Department of Justice of the European Commission, discussed the most important shortcomings regarding AI liability and considered them in connection with the Product Liability Directive.

The following issues of the Directive were pointed out by the experts:

  • Time limit of 10 years: in the view of most of the speakers, this may be an issue because the Directive concerns producers only, which could be difficult whenever operators, users, owners and other stakeholders are involved. Furthermore, 10 years is fine for traditional products, but it may not work in terms of protecting victims in relation to some AI artefacts and systems.
  • Scope: the Directive concerns the protection of consumers but does not address the protection of victims. Consumers and victims sometimes overlap, but not always.
  • Notion of defect: this may cause some difficulty in distinguishing between products and services. The Directive covers only products, not services, which may raise some concerns in relation to the Internet of Things and software.


The Commission has made available links to the sessions for all those who did not manage to attend the event or would like to watch one or more sessions again.

Do you need assistance with AI Ethics? We can help you. Our comprehensive services cover both Data Protection Impact Assessments and AI Ethics Assessments.

EU-US Privacy Shield

EU-US Privacy Shield invalidation business implications follow-up

Since the Court of Justice of the European Union (CJEU) invalidated the EU-US Privacy Shield in their Schrems II judgement delivered two weeks ago, many questions have arisen around international data transfers to the US.

After the invalidation of the EU-US Privacy Shield by the CJEU two weeks ago, as reported by Aphaia, data transfers to the US require another valid safeguard or mechanism that provides an adequate level of data protection similar to the one granted by the GDPR.

European Data Protection Board guidelines

With the aim of clarifying the main issues derived from the invalidation of the EU-US Privacy Shield, the European Data Protection Board (EDPB) has published Frequently Asked Questions on the Schrems II judgement. These answers are expected to be developed and complemented along with further analysis, as the EDPB continues to examine and assess the CJEU decision.

In the document, the EDPB reminds readers that there is no grace period during which the EU-US Privacy Shield is still deemed a valid mechanism for transferring personal data to the US. Businesses that were relying on this safeguard and wish to keep transferring data to the US should therefore find another valid safeguard which ensures a level of protection essentially equivalent to that guaranteed within the EU by the GDPR.

What about Standard Contractual Clauses?

The CJEU considered that the validity of SCCs depends on the ability of the data exporter and the recipient of the data to verify, prior to any transfer and taking into account the specific circumstances, whether that level of protection can be respected in the US. This seems difficult, though, because the Court found that US law (i.e., Section 702 FISA and EO 12333) does not ensure an essentially equivalent level of protection.

The data importer should inform the data exporter of any inability to comply with the SCCs and, where necessary, with any supplementary measures. The data exporter, for its part, should carry out an assessment to ensure that US law does not impinge on the adequate level of protection, taking into account the circumstances of the transfer and the supplementary measures that could be put in place; it may contact the data importer to verify the legislation of its country and collaborate on the assessment. Where the result is not favourable, the transfer should be suspended, or, if the data exporter intends to continue transferring despite this, it should notify the competent Supervisory Authority.

What about Binding Corporate Rules (BCRs)?

Given that the reason for invalidating the EU-US Privacy Shield was the degree of interference created by US law, the CJEU judgement applies in the context of BCRs as well, since US law will also have primacy over this tool. As with SCCs, an assessment should be run by the data exporter, and the competent Supervisory Authority should be notified where the result is not favourable and the data exporter plans to continue with the transfer.

What about derogations of Article 49 GDPR?

Article 49 GDPR comprises further conditions under which personal data can be transferred to a third country in the absence of an adequacy decision and appropriate safeguards such as SCCs and BCRs, namely:

  • Consent. The CJEU points out that consent should be explicit, specific to the particular data transfer or set of transfers, and informed. This involves practical obstacles for businesses processing their customers’ data, as it would imply, for instance, asking for each customer’s individual consent before storing their data on Salesforce.
  • Performance of a contract between the data subject and the controller. It is important to note that this only applies where the transfer is occasional and only for transfers that are objectively necessary for the performance of the contract.

What about third countries other than the US?

The CJEU has indicated that SCCs can, as a rule, still be used to transfer data to a third country; however, the threshold set by the CJEU for transfers to the US applies to any third country, and the same goes for BCRs.

What should I do when it comes to processors transferring data to the US?

Pursuant to the EDPB FAQs, where no supplementary measures can be provided to ensure that US law does not impinge on the essentially equivalent level of protection as granted by the GDPR and if derogations under Article 49 GDPR do not apply, “the only solution is to negotiate an amendment or supplementary clause to your contract to forbid transfers to the US. Data should not only be stored but also administered elsewhere than in the US”.

What can we expect from the EDPB next?

The EDPB is currently analysing the CJEU judgment to determine the kind of supplementary measures that could be provided in addition to SCCs or BCRs, whether legal, technical or organisational measures.

ICO statement

The ICO is continuously updating its statement on the CJEU Schrems II judgement. The latest version so far dates from 27th July and confirms that the EDPB FAQs still apply to UK controllers and processors. Until further guidance is provided by EU bodies and institutions, the ICO recommends taking stock of the international transfers businesses make and reacting promptly, and states that it will continue to apply a risk-based and proportionate approach in accordance with its Regulatory Action Policy.

Other European Data Protection Authorities’ statements

Some European data protection supervisory authorities have provided guidance in response to the CJEU Schrems II judgement. While most countries are still considering the implications of the decision, some others are warning about the risk of non-compliance, and a few, like Germany (particularly Berlin and Hamburg) and the Netherlands, have openly stated that transfers to the US are unlawful.

In general terms, the ones that are warning about the risks claim the following:

  • Data transfers to the U.S. are still possible, but require the implementation of additional safeguards.
  • The obligation to implement the requirements contained in the CJEU’s decision is both on the businesses and the data protection supervisory authorities.
  • Businesses are required to constantly monitor the level of protection in the data importer’s country.
  • Businesses should run a previous assessment before transferring data to the US.

The data protection supervisory authority in Germany (Rhineland-Palatinate) has proposed a five-step assessment for businesses. We have prepared the diagram below which summarizes it:

Can the level of data protection required by the GDPR be respected in the US?

The CJEU considered that the requirements of US domestic law and, in particular, certain programmes enabling access by US public authorities to personal data transferred from the EU, result in limitations on the protection of personal data which do not satisfy GDPR requirements. Furthermore, the CJEU stated that US legislation does not grant data subjects actionable rights before the courts against the US authorities.

In this context, it seems unlikely that a company could demonstrate that it can provide an adequate level of data protection to personal data transferred from the EU, because it would essentially have to bypass US legislation.

Recent moves in the US Senate do not shed light on this issue: the “Lawful Access to Encrypted Data Act”, introduced last month, would mandate service providers and device manufacturers to assist law enforcement in accessing encrypted data where such assistance would aid the execution of a lawfully obtained warrant.

Do you make international data transfers to third countries? Are you affected by the Schrems II decision? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing. We also offer CCPA compliance services. Contact us today.

BCR Changes for Brexit

BCR Changes for Brexit: EDPB releases statement guiding enterprises.

The European Data Protection Board (EDPB) released a statement of guidance on Binding Corporate Rules (BCRs), for groups of undertakings, or enterprises which have the UK ICO as their lead supervisory authority (BCR Lead SA).


As shifts are made towards the official implementation of Brexit, many structural and procedural changes are being made for businesses. One such change, adopted on 22 July 2020, stems from the analysis currently being undertaken by the EDPB of the consequences of the CJEU judgment in Data Protection Commissioner v Facebook Ireland and Schrems for BCRs as transfer tools. The statement outlines BCR changes for Brexit implementation, complete with a table guide setting out the criteria for a BCR Lead SA change, how and why, and referencing the legislation for each criterion.


Procedural Changes for Authorized BCR Holders


Groups of undertakings or enterprises with the ICO as their competent Supervisory Authority (BCR Lead SA) will need to arrange for a new BCR Lead SA in the EEA, in accordance with the Article 29 Working Party’s Working Document Setting Forth a Co-Operation Procedure for the approval of BCRs for controllers and processors under the GDPR (WP263 rev.01), endorsed by the EDPB. This change in BCR Lead SA will need to take place before the end of the Brexit transition period. For BCRs already approved under the GDPR, the new BCR Lead SA in the EEA will have to issue a new approval decision following an opinion from the EDPB. However, no approval by the new BCR Lead SA is necessary for BCRs for which the ICO acted as BCR Lead SA under Directive 95/46/EC.


Content Changes for Authorized BCR Holders.


Before the end of the Brexit transition period, BCR holders with the UK’s ICO as their BCR Lead SA will need to amend their BCRs to reference the EEA legal order. Without these changes (or a new approval, where applicable) by the end of the transition period, these enterprises or groups of undertakings will no longer be able to use their BCRs for transfers of data outside the EEA.


Procedural Changes for BCR Applications Before the ICO.


Any groups of undertakings or enterprises with BCRs at the review stage with the ICO are encouraged to identify a new BCR Lead SA, following the guidance of WP263 rev.01, before the end of the Brexit transition period. They will need to contact the new SA and provide the necessary information for that SA to be considered as the new BCR Lead SA. The new BCR Lead SA will then take over the application and begin the approval procedure, subject to an opinion of the EDPB.


Groups of undertakings or enterprises may choose to transfer their application to a new BCR Lead SA after approval by the ICO, in which case, the new BCR Lead SA will need to approve this new application before the end of the transition period, as the new competent SA, according to Article 47.1 GDPR.


Content Changes for BCR Applications Before the ICO.


Groups of undertakings or enterprises with BCRs in the process of approval by the ICO must make sure that their BCRs refer to the EEA legal order with information on expected changes, before the end of the Brexit transition period. 


General Changes for BCR Applications 


Any Supervisory Authority in the EEA, approached to act as the new BCR Lead SA, will consider whether it is indeed the appropriate SA on a case by case basis, based on the criteria of the WP263 and in collaboration with any other concerned Supervisory Authorities. The EDPB has provided a checklist of elements for Controller and Processor BCRs which need to be changed due to Brexit, as part of this statement released last month. 


Does your company have the UK ICO as its lead supervisory authority? If so, you may be required to make significant changes before the end of the Brexit transition period. Aphaia’s data protection impact assessments, GDPR and Data Protection Act 2018 consultancy services and Data Protection Officer outsourcing can assist you in ensuring compliance.