ICT regulation

ICT regulation in 2021: Four things to look out for

From regulation of Big Tech to the upcoming legislative framework for AI, I have identified the key areas to be on the lookout for in 2021 when it comes to regulating ICT.


  1. Regulating digital gatekeepers


Big Tech has been on the regulators’ and legislators’ radar for a while, with earlier EU antitrust fines imposed on Google and the introduction of more serious fines by the GDPR. While a number of antitrust cases were filed against Facebook in the US in late 2020, the latest European Commission proposal signifies a more considerable innovation when it comes to regulating Big Tech.

The proposed Digital Markets Act creates something new: asymmetric, market power-based regulation of digital gatekeepers. Whereas the new Digital Services Act continues to build on the existing consumer protection philosophy, which only acknowledges the asymmetry between the consumer and the business, the Digital Markets Act only imposes remedies on those digital platforms that have an entrenched, durable intermediary position. This is akin to the asymmetric regulation of telecoms operators with Significant Market Power.

In a manner that resembles telecoms infrastructure regulation, the Digital Markets Act seeks to grant access to the Big Tech platforms by means of ‘unbundling’ some of their features. Considering the regulators’ and legislators’ reluctance to regulate Internet ‘content’ since the late 1990s, such ex ante measures can be seen as truly historic.


  2. AI regulation


Following the introduction of the GDPR rules on profiling and human intervention, the Ethics Guidelines for Trustworthy AI prepared by the EU High-Level Expert Group on Artificial Intelligence (AI HLEG) have provided a strong hint that we can expect horizontal legislative action in the area of AI.

Following a public consultation in 2020, the European Commission expects to unveil a legislative proposal regulating AI in the first quarter of 2021. It remains to be seen to what extent European legislators will transpose the ethical principles identified by the AI HLEG, such as human agency and oversight or technical robustness and safety, into mandatory legal obligations.


  3. European Electronic Communications Code (EECC)


The EECC, a new Directive incorporating most of the EU electronic communications legislation, was due for implementation by 21 December 2020. With a few Member States still lagging behind schedule, often due to COVID-19, the European Commission has already adopted a Delegated Regulation on EU-wide voice-call termination rates for both fixed and mobile calls. This further reduces the wholesale prices of voice calls within the EU, with the aim of further reducing retail prices.

The impact of the EECC on telecoms markets remains to be seen. The new Directive gives national regulatory authorities more options to tackle market failure through commitments of dominant players. It modernises and further harmonises the rules on spectrum with a view to 5G and the technologies to follow, plus introduces basic regulatory protection for customers using number-independent OTT communications services. The latter have until now largely been excluded from telecoms sector regulation.

Practical effects will largely depend on the implementation in the Member States. For example, will the regulators be able to leverage regulatory sticks and carrots to foster the emergence of wholesale-only broadband infrastructure players? Implementation at the national level will also be crucial to reap the full benefits of the new spectrum management rules, according to Vesna Prodnik of Vafer, a specialised mobile telecoms consultancy: “Member States still have a wide discretion as to the exact rules on simplifying small-area wireless cells placement, which are crucial for the necessary 5G network density.”


  4. Electronic identity


As an even larger share of business and private life moves online because of the COVID-19 pandemic, online identity fraud has become even more rampant. Should governments centralise the approach to e-identity? Or should we rely on decentralised, commercially offered identity solutions? Should everyone receive an ID certificate, or another means of verifying who they are, for use in all online environments?

The EU eIDAS Regulation has introduced an interoperability framework for EU citizens using their own national electronic identification schemes (eIDs) to access public services in other Member States. It has further created an internal market for trust services – namely electronic signatures, electronic seals, time stamps, electronic delivery services and website authentication – by ensuring that they work across borders and have the same legal status as their traditional paper-based equivalents.

Despite these developments, we still seem to be far from uniform and universally accepted electronic IDs, especially at the international level. A further push may come from the European Commission’s review of the eIDAS Regulation, which is currently underway following an open public consultation.


Next steps for ICT businesses


  • Check how your online operations might be affected in the future by the additional obligations proposed by the EU Digital Services Act
  • If you develop or deploy AI solutions, consider doing an AI ethics impact assessment to ensure their long-term viability
  • Check if any of the services you provide online might be classified as interpersonal communications services and therefore subject to EECC regulation

At Aphaia, we will continue to keep up with these developments. Please reach out if your business requires assistance with any of them. You can visit us at https://aphaia.consulting to explore our full array of services.

Opportunities for online marketplaces

Opportunities for online marketplaces have grown during COVID

Opportunities for online marketplaces have grown during the COVID-19 pandemic, from second-hand clothes to artworks. Since privacy risks grow with the number of users and transactions, this is a good time for platforms to review their approach to data protection.

This article builds on my earlier article on COVID-19 business adaptation, published as part of the Aphaia Blog industry adaptation series. It is based on my own and the broader Aphaia team’s practical experience, plus insights from some awesome clients and industry experts.

Online marketplace culture taking over

According to Fabio Occhiuzzo, Operations Manager at Depop, a global second-hand marketplace for fashion items, “this unique moment in time will encourage more and more people to reconsider resale as an alternative to shopping new and therefore cause a long-term channel shift in how we consume fashion, especially as we see the pandemic reveal the realities of brick and mortar and the benefit of digital commerce.”

Depop’s insight reveals that the above is about more than just availability and convenience: it reflects a broader cultural, environmental and ethical shift. “We also know that, with resale being a primary function of our marketplace, we’re champions of responsible fashion consumption and this pandemic is really shining a light on the realities of the current fashion ecosystem. Depop represents a move in the right direction for fashion because we extend the life of millions of items, which helps reduce waste. We want to use our reach to drive positive changes across the industry, making fashion circular. This means generating a culture that’s based around self-expression, creativity and creating more value in what you already own. And it’s something we’re hearing from our community more and more during this time – they’re the champions of this movement and they expect consumers to also be more mindful of the environment after this pandemic ends,” concludes Mr Occhiuzzo.

A continuation of the trend towards remote selling is also acknowledged by Arianna Perini, an arts management professional: “COVID-19 only fuelled this development. During art auctions, for example, most top lots had already been sold through telephone bids, showing that physical presence was not a must in order to buy an artwork, regardless of its price.” She further points to the emergence of wider online art marketplaces that have, in a way not dissimilar to online marketplaces in other industries, lowered the online market entry threshold for smaller players. “Gagosian and Zwirner have opened their online platforms to host smaller galleries that could not afford to launch their own online platforms. The same online shift has been undertaken by the major art fairs, such as Art Basel and Frieze,” says Ms Perini.

Online marketplaces and GDPR

The data protection implications for online marketplaces might at first glance look similar to those for other online businesses. True, an online marketplace would typically process the data of its customers, both sellers and buyers. However, other persons’ data might be involved: selling an artwork would involve the use of the artist’s name and other information, and selling a fashion item might involve a personally identifiable photo of a model. Under each of these scenarios, an online marketplace needs to be prepared to process the data of third parties who are not its customers, which needs to be reflected in its privacy policies.

Furthermore, depending on their business model, some online marketplaces may act as data processors on behalf of their participants. In that case, they need to enter into a data processing agreement with their participants that includes all the mandatory components mentioned in Article 28 GDPR. Most importantly, a data processor is not allowed to process their customers’ data on their own behalf. Accordingly, this model would typically be suitable for those platforms that focus on underlying marketplace technology and do not intend to benefit from content analytics for their own purposes.

Online marketplaces, ePrivacy Regulation and NIS Directive

Online marketplaces regularly process data that is less common in other online services. Notably, online marketplaces typically include a peer-to-peer messaging feature that enables direct contact between buyers and sellers. Whereas such messaging platforms typically require a degree of supervision for fraud prevention and user safety purposes, one should also note that message content is subject to a higher expectation of privacy than other user data processed. Notably, the Proposal for the new EU ePrivacy Regulation extends communications privacy protection to messaging services that are ancillary to other services, such as online marketplaces or online gaming.

In addition, online marketplaces may be captured by additional security requirements of the Directive (EU) 2016/1148 concerning measures for a high common level of security of network and information systems across the Union, known as the NIS Directive. These include the requirement for a separate notification of security incidents or data breaches, in addition to a similar requirement of the GDPR.

Key data privacy tips for online marketplaces

If you run an online marketplace and have not yet looked deeply into the data privacy aspects of your business, here are some key tips:

  • review your Privacy Policy: ensure it captures all the data and all the individuals whose personal data you are processing;
  • review the privacy regime for your messaging feature: is there the right balance between fraud prevention, security and privacy of the participants?
  • check if your online marketplace might be captured by your country’s legislation on NIS security incident reporting requirements;
  • check whether your Privacy Policy makes it clear whether you are using artificial intelligence (AI) to profile buyers and sellers when matching them;
  • ensure buyers and sellers are informed with whom you might be sharing their data, including their preferences;
  • if you are a processor for the sellers, make sure your Terms of Service include an Article 28 GDPR data processing agreement.

Do you have an online marketplace and have questions about how to comply with all the privacy and data protection requirements? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Business adaptation to COVID-19

Business adaptation to COVID-19 and Data Privacy

The measures businesses have taken to adapt to the COVID-19 crisis are unlikely to be temporary, and the same goes for their consequences for personal data processing operations.

In this article, I have gathered some key insights from various industry players and experts, and reconciled them with my own industry observations and the observations of the wider Aphaia Data Protection Officer (DPO) team.

More opportunities mean more data

It is by now clear that the lack of offline opportunities has created a number of online opportunities in those industries that do not necessarily depend on physical, hand-to-hand delivery. This is best described in the words of Susana Cárdenas, founder of award-winning heritage Cárdenas Chocolate: “It’s incredible how this current situation has changed our business. For example, hotels and restaurants are closed and sales on hold. However, our clients who sell online have sold out our chocolate because people ordered it as a gift for their loved ones during the quarantine. That was the case of chocolate online retailers in Paris and Spain.”

An even more prominent change can be observed in the art market. According to Arianna Perini, an arts management professional, “Auction Houses have only had online sales in the past three months, plus most global galleries have opened online viewing rooms to enable buyers to still ‘visit’ and buy their artworks. A good example comes from Gagosian and Zwirner.” Ms Perini further notes that many startups have taken advantage of this situation with their ideas to digitise and democratise the art market, an example being Vortic, launched by Victor Miro in London. Even though the possibility to physically experience art is timeless, she foresees a permanent shift towards the online even after the situation has gone back to normal.

Whereas such developments may, for experienced online sellers, mean a bonanza not only in profits but also in customer data, those who have only now discovered online sales channels may be caught off guard by the requirements for lawful data processing. Some businesses have set aside basic compliance requirements, such as transparency of their customer data processing; we have also seen others who fear that anything they do with the data might be unlawful, for example that any processing of their customer data might require GDPR consent.

Moving your business online should not be taken too lightly

That said, whilst moving your business online is much easier now than it would have been only a couple of years ago, there seems to be a huge difference between businesses whose DNA comprised online work before the COVID-19 pandemic and those where this is not the case. That type of DNA often has very little to do with whether a business has to move around physical goods or perform services on-site or on a person.

Aphaia launched our DPO Outsourcing product, primarily based on collaborative client interaction on Trello, in 2017, when the privacy industry was still based on privacy consultants spending long hours at clients’ premises, a business model unsustainable for tech startups and young tech businesses, which are now typically required by the GDPR to appoint a DPO. Even if the markets might have allowed some of these legacy business models to continue, COVID-19 has put an end to them. If your business can operate online, it now must operate online.

According to Olga V. Mack, CEO at Parley Pro, a collaborative and intuitive contract platform, the migration of legal services to work from home faces what might at first glance appear to be rather basic challenges: employees having laptops and reliable access to the internet, plus the company having a plan for disasters. We may as well add the use of secure, cloud-based services that enable end-to-end encrypted sharing of data and operate based on Article 28 GDPR-compliant terms of service.

Unwanted consequences of the surge in online business

Unfortunately, new opportunities for lawful business also mean new opportunities for illegal activities targeting people’s data and property. The effects of this have been more tangible than one might think. According to Pamela Mcloughlin, Head of Digital Money & New Ventures at Hello Soda, the COVID-19 pandemic has caused a peak in e-commerce traffic, but a rise in e-commerce comes with a rise in fraud: “This is not even industry specific; we have noticed that a lot of our clients have seen a rise in fraud. We as a KYC, anti-money laundering and ID verification vendor have had to respond by managing an influx of businesses who need to integrate our anti-fraud tools urgently and also automate their ID verification processes, which were previously manual.”

Needless to say, businesses that process financial data and special categories of data, notably health-related data, need to ensure that communications channels and storage mediums they use are secure to prevent any potential personal data breach.

Our tips to get your data privacy right post-COVID-19

Whilst the privacy-related consequences of the COVID-19 pandemic vary from business to business, there are some universal take-home messages:

  • if you have previously engaged in online sales of goods and services only from time to time but are now doing so regularly, you should review your online privacy policy – or prepare one if you do not yet have one. You might be surprised, but there are customers who do read such documents and make complaints to the data protection authorities if they do not like what they see!
  • review your communications channels to ensure that key transactions information is adequately encrypted;
  • review your list of sub-processors whom you use to store in the cloud, analyse, or send your customers’ and employees’ information. Each of them must have Article 28 GDPR-compliant terms of service in place. No excuses, no exceptions;
  • if your employees are using their own devices (BYOD) to work from home, please ensure they have installed appropriate malware protection;
  • set out your employees’ obligations and the procedures for protecting personal data in an internal data protection policy;
  • make sure you are ready for a potential data breach, which needs to be reported to the data protection authority within 72 hours of discovery.
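As a back-of-the-envelope illustration of the 72-hour rule in the last tip, here is a minimal Python sketch; the function names and the example timestamps are purely illustrative, not part of any official tooling:

```python
from datetime import datetime, timedelta, timezone

# GDPR Article 33: notify the supervisory authority without undue delay
# and, where feasible, within 72 hours of becoming aware of the breach.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(discovered_at: datetime) -> datetime:
    """Latest moment the breach must be reported to the authority."""
    return discovered_at + NOTIFICATION_WINDOW

def hours_remaining(discovered_at: datetime, now: datetime) -> float:
    """Hours left to notify (negative if the window has already lapsed)."""
    return (notification_deadline(discovered_at) - now).total_seconds() / 3600

# Breach discovered on a Monday morning: the clock runs over the weekday boundary.
discovered = datetime(2021, 1, 4, 9, 0, tzinfo=timezone.utc)
print(notification_deadline(discovered).isoformat())  # 2021-01-07T09:00:00+00:00
```

The point of the sketch is that the deadline is measured in elapsed hours from awareness, not in business days, which is why incident response plans should work on weekends too.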

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Legislative enforcement and AI

Regulating the right to privacy in the AI era

What about privacy in the AI era? New developments in 2019 have shown that the GDPR rules on AI profiling could not be timelier. From smart billboards to home audio devices, AI has been deployed to make sense of everything we expose about ourselves, including our faces and the things we casually say. Despite these developments, which have raised concerns on numerous occasions, legislative enforcement in the field has been somewhat slow. Will 2020 be the year when privacy regulation finally hits back?

AI technology

Despite toughening legislation, there still seems to be a clear bias towards technology deployment, irrespective of whether its implementation meets compliance requirements. It is worth noting that technology as such is rarely ‘non-compliant’; rather, it is the way it is used that raises issues.

Take smart billboards capable of reading your facial features, which were deployed at numerous busy, publicly accessible locations in 2019. Have these projects all undergone a General Data Protection Regulation (GDPR) data protection impact assessment, as required by law? One should note that video monitoring of a public space in itself bears considerable privacy risks. When real-time analysis of your facial features is added to such video monitoring, the GDPR clearly gives you the right to object to profiling. Even if we disregard the obvious difficulty of expressing your objection to a billboard on a busy street, how will your objection to any such profiling be observed the next time you pass by?

Machine learning enables us to make increasing sense of vast amounts of data. If they do not already, the solutions deployed in 2020 are projected to feel even more intrusive. Ironically, however, this might not apply to certain smart systems put in place to learn to provide a more subtle, less visibly intrusive and therefore more effective link between our preferences and the commercial offers served to us. This might help us understand which aspect of targeted advertising we loathe more: the privacy intrusion or its clumsy implementation.

The law and AI

The notion that the law is simply ‘unable to keep up with technology’ is not only an inadequate response to the problem but is also largely unfounded as a claim. The GDPR includes specific provisions on profiling and automated decision-making, specifically tailored to the use of artificial intelligence in relation to the processing of personal data. Such processing is subject to the right to obtain human intervention and the right to object to it. Additional limitations exist in relation to special categories of data. Certain non-EU countries have started adopting similar principles, including Brazil, which passed its General Data Protection Law (LGPD) in 2018.

The California Consumer Privacy Act (CCPA), while less focused specifically on AI, empowers consumers by enabling them to prohibit the ‘sale of data’. This is by no means insignificant. Without the possibility to compile and merge data from different sources, its value for machine learning purposes arguably decreases. Conversely, without the ability to sell data, incentives to engage in excessive data analytics can somewhat dissipate.

When it comes to a broader framework for the regulation of artificial intelligence, the legal situation is for now less clear. Principles and rules are currently confined to non-binding guidelines, such as the EU Ethics Guidelines for Trustworthy AI. But this does not affect the privacy aspects, where European regulators are already able to impose fines of up to €20 million or 4% of a company’s global turnover, whichever is higher. CCPA fines are lower but might be multiplied by the number of users affected.
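To make the orders of magnitude concrete, here is a minimal sketch of the fine arithmetic. The figures come from the texts of the two laws (Article 83(5) GDPR; CCPA civil penalties of up to $2,500 per violation, or $7,500 if intentional); the function names and example numbers are illustrative only:

```python
def gdpr_fine_ceiling(global_turnover_eur: float) -> float:
    """Higher of EUR 20 million or 4% of worldwide annual turnover (Art. 83(5) GDPR)."""
    return max(20_000_000, 0.04 * global_turnover_eur)

def ccpa_penalty_ceiling(affected_consumers: int, intentional: bool = False) -> int:
    """CCPA per-violation civil penalty, multiplied across affected consumers."""
    per_violation = 7_500 if intentional else 2_500
    return per_violation * affected_consumers

# A firm with EUR 2bn turnover: the 4% limb (EUR 80m) exceeds the flat EUR 20m cap.
print(gdpr_fine_ceiling(2_000_000_000))   # 80000000.0
# 50,000 affected consumers, unintentional violations: USD 125m in theory.
print(ccpa_penalty_ceiling(50_000))       # 125000000
```

As the second example shows, even the lower CCPA per-violation amounts can in principle overtake the GDPR’s flat cap once the number of affected users is large, which is why the multiplication effect matters.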

The AI regulatory landscape

Early in 2019, the French data protection authority CNIL imposed a fine of €50 million on Google for insufficient transparency in relation to targeted advertising. As noted by CNIL, “essential information, such as the data processing purposes, the data storage periods or the categories of personal data used for the ads personalisation, are excessively disseminated across several documents, with buttons and links on which it is required to click to access complementary information.” Whereas the fine was far from the upper limit imposable under the GDPR, the case paves the way for further questions to be asked by data protection authorities in 2020.

For example, are machine-learning algorithms and the data sources used for them sufficiently explained? When the data protection authorities seek answers to such questions, will they rely on the information provided by companies? Alternatively, they might start digging deeper based on anecdotal evidence. How come the user is seeing a particular ad? Is this based on a sophisticated machine-learning algorithm or analysing data that should not have been analysed?

So far, privacy legal battles have largely focused on formal compliance, as in both ‘Schrems’ cases. But AI usage trends in 2020 might force regulators to look deeper into what is actually going on inside home-based and cloud-based black boxes. As I write this article, the EU has just floated the idea of a temporary ban on facial recognition in public places.

Does your company use artificial intelligence in its day-to-day operations? If so, failure to adhere fully to the guidelines and rules of the GDPR and Data Protection Act 2018 could result in hefty financial penalties. Aphaia’s data protection impact assessments and Data Protection Officer outsourcing will assist you with ensuring compliance.

This article was originally published on Drooms blog.