
Amazon faces possible fines for alleged GDPR violations

Amazon faces a possible fine of at least €350 million for alleged GDPR violations.

Luxembourg’s privacy regulator, the CNPD, is proposing a fine of at least €350 million on Amazon.com Inc. for alleged violations of the GDPR. Before this draft decision can become final, it must be approved by the other EU privacy regulators, a process that could take months and may result in a fine higher or lower than the proposed amount. While it would be the bloc’s biggest GDPR penalty yet, the amount is roughly 2% of the company’s reported net income for 2020, and some other EU regulators argue that it may not be enough.

The alleged violations by Amazon are related to the company’s collection and use of personal data. 

 

The draft decision for the sanction has been circulated among the bloc’s 26 other authorities. Because Amazon’s EU headquarters is based in the Grand Duchy, the CNPD, Luxembourg’s data protection commission, is the lead authority in this case. The proposed fine relates to alleged violations of the EU’s GDPR concerning Amazon’s collection and use of personal data; it is not linked to the company’s cloud computing business, Amazon Web Services. Months ago, former information security employees blew the whistle on the tech giant over privacy and compliance issues. According to Politico, three individuals who were interviewed anonymously and identified as former high-level employees of the company raised flags over the security of customers’ information not being prioritized as it should be. Due to the status of the legal proceedings, however, the privacy regulator was unable to provide many details on the specifics of the alleged violations being brought against the tech giant.

According to the whistle-blowing former information-security employees, data stored by Amazon is at risk because there is a lack of clarity on what data is being stored, where it is stored and who can access it. As a result, it would be extremely difficult for Amazon to fulfil a request from a customer wishing to exercise their right to erasure, as the company could not reliably identify all of the places where every piece of that customer’s information is held. Article 17 of the GDPR gives data subjects the right to request that all their personal data be erased by a data controller, and to have that request fulfilled without undue delay. Amazon maintains that the privacy of its customers is a priority and that it complies with the laws of the countries in which it operates.
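
To make the Article 17 point concrete, the minimal Python sketch below shows a hypothetical controller-side data inventory used to fan out an erasure request; all names here (PERSONAL_DATA_INVENTORY, erase_subject, the system names) are illustrative and do not describe Amazon’s actual systems. The risk the former employees describe is visible in the code: any system missing from the inventory silently survives erasure.

    # Hypothetical inventory: which systems hold which categories of personal data.
    PERSONAL_DATA_INVENTORY = {
        "orders":        ["orders_db", "analytics_warehouse"],
        "payment_info":  ["billing_db"],
        "support_chats": ["crm", "ticket_archive"],
    }

    def erase_subject(subject_id, delete_fn):
        """Ask every known system to delete the subject's data; return an audit trail."""
        audit_trail = []
        for category, systems in PERSONAL_DATA_INVENTORY.items():
            for system in systems:
                delete_fn(system, subject_id)           # delegate the actual deletion
                audit_trail.append((category, system))  # record where erasure was requested
        return audit_trail

    # Example usage with a stand-in deletion function:
    if __name__ == "__main__":
        log = erase_subject("customer-123",
                            lambda system, sid: print(f"deleting {sid} from {system}"))
        print(log)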

 

Amazon faces a possible record-breaking fine, which could climb even higher by the time a final decision is reached.

While the proposed amount would be a record for EU regulators, some regulators feel that, given the size of the company among other factors, it may not be enough. Under the GDPR, fines of up to 4% of a company’s annual worldwide turnover may be imposed for the most serious violations. The proposed fine is only about 2% of Amazon’s reported net income for 2020, which totaled approximately €17.5 billion. The final decision, which could take several months, may feature a higher or lower fine, and the review by the other regulators leaves room for the amount to rise considerably within the GDPR’s limits. This draft decision is one of many privacy enforcement actions being taken against tech giants like Amazon. Ireland’s privacy regulator has also expressed its intent to issue draft decisions against other tech giants, which may include Facebook, Google and Apple, all of which have their EU headquarters in Ireland.
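
As a rough back-of-the-envelope check of the percentages quoted above (using the approximate figures cited in this article, not official filings), the short Python sketch below illustrates the arithmetic:

    # Approximate figures as cited above.
    net_income_2020_eur = 17.5e9   # Amazon's reported net income for 2020, approx.
    proposed_fine_eur = 350e6      # fine proposed by the CNPD

    print(f"Fine as a share of 2020 net income: {proposed_fine_eur / net_income_2020_eur:.1%}")
    # -> about 2.0%

    # Note: the GDPR's Article 83(5) ceiling is 4% of annual worldwide turnover, which for
    # a company of Amazon's size is a far larger base than net income, so the legal maximum
    # sits well above the proposed €350 million.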

 

Does your company have all of the mandated safeguards in place to ensure compliance with the ePrivacy, GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides ePrivacy, GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, EU AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.


EU Cloud Code of Conduct approved by the EDPB

EU Cloud Code of Conduct approved by the EDPB to ensure GDPR compliance for the cloud industry in Europe.

Two Codes of Conduct have recently been approved for the cloud industry to ensure GDPR compliance for cloud services in Europe. Euractiv reported that the European Data Protection Board (EDPB) approved Codes of Conduct on cloud service providers and cloud infrastructure last month. EDPB Chair Andrea Jelinek said, “We welcome the efforts made by the code owners to elaborate codes of conduct, which are practical, transparent and potentially cost-effective tools to ensure greater consistency among a sector and foster data protection compliance.” The two Codes of Conduct are the first of their kind to be formally approved by data protection authorities and will provide a blueprint for compliance with data protection regulation in Europe.

All Cloud Service Providers are invited to join the EU Cloud Code of Conduct, which covers the full spectrum of cloud services.

The new EU Cloud Code of Conduct covers the full array of services: software (SaaS), platform (PaaS) and infrastructure (IaaS). The code was drafted together with European Union authorities and is intended to give cloud service providers guidance on data protection compliance while building customer trust in their cloud services. There are various membership options depending on the interests of the cloud service provider, and providers will be able to declare their services adherent to the code. The codes are expected to increase transparency and trust in the European cloud computing market. Both Codes will appoint independent monitoring bodies to ensure that the Codes are applied in a GDPR-compliant manner. These monitoring bodies will provide external auditing and will be accredited by the relevant data protection authority.

These codes of conduct are expected to boost the cloud computing industry, bringing greater certainty to both EU companies and citizens.

Many EU companies still do not use cloud computing, and uncertainty around judicial applicability and data protection is seen as a barrier for many of them. This major step towards providing clear guidance to EU companies is expected to address those issues as cloud computing becomes increasingly popular. As an added benefit, businesses will be able to avoid some of the uncertainty created by Schrems II: although these codes cannot be used in the context of international data transfers, customers will be able to request that their data be stored within the EU. EU citizens will enjoy greater control over their personal data, transparency on where their data is stored, and greater certainty surrounding the use of their data.

Does your company have all of the mandated safeguards in place to ensure compliance with the ePrivacy, GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides ePrivacy, GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, EU AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.


AEPD fines EDP Comercializadora, S.A.U 1.5 million euros

AEPD fines EDP Comercializadora, S.A.U 1.5 million euros for two violations of the GDPR. 

 

EDP Comercializadora, S.A.U., an electricity service provider in Spain, has been fined by the AEPD, Spain’s data protection authority, for two violations of the GDPR. The company was found to lack sufficient technical and organisational measures to verify whether someone signing up for its services on behalf of another natural person is in fact authorised to do so, or authorised to process personal data on behalf of that person. The AEPD also found that in some cases the company was not providing data subjects with sufficient information about the processing of their personal data, owing both to the content of the informational document given to data subjects and to the way that information was delivered. A total of €1.5 million in fines was imposed on the company for these violations, in accordance with Article 83 of the GDPR.

AEPD fines EDP Comercializadora, S.A.U. €500,000 for a violation of Article 25 of the GDPR.

Article 25(2) of the GDPR requires the implementation of appropriate technical and organisational measures to ensure the protection of personal data from the point of collection and throughout its use and storage. In addition, the regulation states: “In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.” EDP Comercializadora, S.A.U. was found to lack sufficient measures to avoid and mitigate the risks associated with the processing of personal data where the service is registered for by a third party. In particular, the company lacked the technical and organisational measures needed to verify, first, whether a third party who contracts its services on behalf of another natural person is authorised to do so and, second, whether that third party is authorised by the person concerned to process personal data on their behalf. In accordance with Article 83(4)(a), the supervisory authority imposed a fine of €500,000 for this infringement.
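
By way of illustration only, the short Python sketch below shows the kind of check such “technical and organisational measures” might involve when a contract is taken out by a third party; the workflow and names (SignupRequest, can_activate) are hypothetical and are not taken from the AEPD’s decision.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SignupRequest:
        contracting_party: str                       # the person submitting the contract
        service_holder: str                          # the person the service is actually for
        authorisation_doc_id: Optional[str] = None   # reference to a signed mandate, if any

    def can_activate(request: SignupRequest) -> bool:
        """Allow activation only if the contracting party is the service holder,
        or a documented authorisation from the service holder is on file."""
        if request.contracting_party == request.service_holder:
            return True
        return request.authorisation_doc_id is not None

    # A third-party signup without a recorded mandate is rejected:
    print(can_activate(SignupRequest("Ana", "Luis")))                        # False
    print(can_activate(SignupRequest("Ana", "Luis", "mandate-2021-0042")))   # True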

 

An additional fine of €1 million was imposed by the AEPD for a breach of Article 13 of the GDPR.

Article 13 of the GDPR sets out the comprehensive and specific information that a data controller must provide to every data subject at the point when personal data is collected from them. Upon review of the document that the controller, EDP Comercializadora, S.A.U., provides to data subjects, information was found to be lacking regarding the identity of the controller, the legal bases for processing not based on consent, the purposes of processing relating to profiling on the basis of legitimate interest, and the possibility of objecting to processing activities that the controller bases on its legitimate interest. In addition, in some of the company’s procedures, such as contracting its services by telephone, the required information was not simple and immediate for the data subject to access. For this, a fine of €1,000,000 was imposed by the AEPD.
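
Again purely as an illustration, a privacy notice can be screened for the kinds of gaps described above; the field names in the Python sketch below are a simplified, hypothetical subset of Article 13, not the AEPD’s checklist, and the example notice is invented.

    # Simplified, hypothetical subset of the information Article 13 requires.
    REQUIRED_ITEMS = {
        "controller_identity",
        "purposes_of_processing",
        "legal_basis",
        "legitimate_interests_pursued",
        "right_to_object",
    }

    def missing_items(privacy_notice: dict) -> set:
        """Return the required items that the notice does not cover."""
        return {item for item in REQUIRED_ITEMS if not privacy_notice.get(item)}

    # An invented notice that omits its legitimate-interest purposes and the right to object:
    notice = {
        "controller_identity": "Example Energy S.A.",
        "purposes_of_processing": "supply of electricity",
        "legal_basis": "contract",
    }
    print(missing_items(notice))   # -> the two omitted items (set order may vary)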

 

Does your company have all of the mandated safeguards in place to ensure compliance with the ePrivacy, GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides ePrivacy, GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, EU AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.


New EU law imposes a time limit on tech giants to remove content

New EU law imposes a time limit of one hour on tech giants to remove terrorist content. 

 

Last month, a new EU law was adopted by the European Parliament, forcing online platforms to remove terrorist content within an hour of receiving a removal order from a competent authority. According to a report from Euractiv, this regulation on preventing the dissemination of terrorist content online has faced some opposition and has been called controversial. The European Commission drafted the law in response to several terror attacks across the bloc. Considered a necessary step in combating the dissemination of terrorist content online, the regulation was formally adopted on April 28th, after being approved by the Committee on Civil Liberties, Justice and Home Affairs in January.

The proposed legislation was adopted without a vote, after approval from the Committee on Civil Liberties, Justice and Home Affairs. 

 

On January 11, the Committee on Civil Liberties, Justice and Home Affairs (LIBE) approved the proposed legislation, with 52 votes in favor and 14 against. A decision was made to forgo a new debate in the chamber, and the proposal was adopted without being put to a vote in the plenary. Since then, the law has come under criticism, with some expressing discomfort that this new EU law was implemented without sufficient opportunity for debate. There are fears that the law could be abused to silence non-terrorist speech that may be considered controversial, or that tech giants may begin pre-emptively monitoring posts themselves using algorithms.

Critics claim that such a short deadline could push tech giants to rely more heavily on automated content-moderation algorithms.

This law has been called ‘anti-free speech’ by some critics, and MEPs were urged to reject the Commission’s proposed legislation. Prior to the April 28th meeting, 61 organisations collaborated on an open letter to EU lawmakers asking that the proposal be rejected. While the Commission has sought to calm many of those fears, some criticism of the new EU law lingers. Critics fear that the short deadline imposed on digital platforms to remove terrorist content may push platforms towards deploying automated content moderation tools, and they note that the law could potentially be used to unfairly target and silence non-terrorist groups. The critics also argued that only courts or independent administrative authorities subject to judicial review should have the power to issue deletion orders.

Provisions have been added to the new EU law taking these criticisms into account.

In the face of criticism of the new EU law, lawmakers appear to be taking the feedback seriously and have added a number of safeguards to the legislation. It has been specifically clarified that the law is not to target “material disseminated for educational, journalistic, artistic or research purposes or for awareness-raising purposes against terrorist activity”. This is intended to curb opportunistic attempts to use the law to target and silence non-terrorist groups because of disagreements or misunderstandings. In addition, the regulation now states that “any requirement to take specific measures shall not include an obligation to use automated tools by the hosting service provider”, addressing the concern that platforms might feel compelled to use automated filters to monitor posts themselves. Transparency obligations have also been added; however, many critics remain dissatisfied with the modifications.

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.