Amazon facing lawsuit in Germany

Amazon facing lawsuit in Germany, accused of breaking EU’s privacy laws.

Amazon facing lawsuit in Germany after being accused of breaking the EU’s privacy laws by continuing to rely on the invalidated EU-US Privacy Shield.


The global giant Amazon is currently facing a lawsuit accusing it of breaking European privacy law, according to this recent article from Politico. The company has allegedly continued to use the infamous Privacy Shield despite its invalidation in Europe, which has led to this lawsuit. The basis is that the Court of Justice of the European Union (CJEU) made clear, in July’s Schrems II judgment, that transferring data through the Privacy Shield was no longer allowed. That ruling invalidated the EU-US Privacy Shield on the grounds that shipping data outside of the EU put it at risk: according to the CJEU, US surveillance practices are more intrusive than they should be and go beyond what is acceptable for privacy. While Amazon understands that the Privacy Shield is invalid, it appears that the company has continued to use this invalidated transfer mechanism.

Standard Contractual Clauses are still a viable option for companies needing to transfer data.

Standard Contractual Clauses (SCCs) are another option for the technological giants and are used by the likes of Google and Facebook. The difference is that exporting data from the EU under SCCs requires more supervision, which better ensures the safety of the data. While SCCs give these companies an alternative, the clauses come with caveats and are not entirely free of problems: Facebook is currently in dispute with the Irish data regulators over its use of the clauses.

EuGD takes legal action against Amazon.

EuGD (Europäische Gesellschaft für Datenschutz) decided to take action, putting forth the formal legal complaint that escalated the conflict. The recent article by Vincent Manancourt features a statement from Johann Hermann, the current head of EuGD, the group behind the complaint: “The [Court of Justice of the European Union] has made it clear that data transfers to the U.S. on the basis of the Privacy Shield are no longer permitted. If the world’s leading cloud company and largest e-commerce provider remains inactive for more than two months and ignores consumer rights, that is unacceptable.” Moreover, the founder of EuGD, Thomas Bindl, said that the decision to take the legal route was made with similar conflicts in mind.

Despite the noise and controversy surrounding the conflict and impending lawsuit, it remains necessary to wait and see how the case develops in court. However, regardless of the outcome of the ruling, the case will likely inspire greater vigilance and compliance on the part of other companies that may also be transferring data out of Europe.


Do you make international data transfers to third countries? Are you affected by Schrems II decision? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We also offer CCPA compliance services. Contact us today.


Amazon launches new technology

Amazon launches new technology which scans palms for identification and payment.

Amazon launches new technology in two of its physical stores which allows for contact-free identification and payment by scanning an individual’s palm.


Amazon is on the verge of launching a new biometric payment system which scans an image of customers’ palms, according to this new BBC article. The new method is an attempt at a contactless replacement for traditional membership and physical loyalty cards. The accuracy and unique identifiers lie in the vein patterns of individuals’ hands, which remain fairly inconspicuous to the naked eye. The scanners require the customer to wave their palm a few inches away from a reader, making it a viable contactless form of ID and payment simultaneously. The system is currently being tested at two Amazon stores. Bills and data will be stored locally at the stores rather than sent to Amazon data centers, and customers will be able to delete their data via the website.


Amazon developers think this technology is safer and more secure than other methods of biometric identification. 


The application seems to be as accurate and effective as fingerprints, but is not as easily identifiable by human vision and is therefore presumably more difficult to replicate. Amazon developers claim it is more secure than other forms of biometrics, which is especially relevant after issues of racial bias were shown in the company’s facial recognition software, use of which by officials has currently been suspended. Recently, we published an article on The National Biometric Information Privacy Act, which was introduced into the US Congress. Bills like these are an attempt to curtail the negative effects or security breaches that may arise from the use of biometric scanners and similar technology.


While this technology is convenient, some point to possible data security risks.


In the midst of the pandemic, the introduction of a payment method requiring less human interaction and no physical contact seems like a much needed innovation. However, some groups are advocating against biometric forms of ID and payment due to the possible privacy issues associated with biometric data being stored by governments or large corporations. Silkie Carlo, director of the privacy rights group Big Brother Watch, says that this new technology is invasive, unnecessary and provides just another outlet for Amazon to cultivate personal data freely despite privacy laws and agreements.


For Amazon, the convenience of biometrics is not overshadowed by the possible invasion of privacy it risks as a direct consequence. If this initial trial at the Seattle locations goes well, implementation of the scanners in many other buildings will be discussed. The technology is part of Amazon’s vision of a supermarket with no human staff, where everything in the store is tracked by AI and machines, and payment can be completed using the new palm scanner for a fully contactless experience.


What does the GDPR say about this type of data processing?


The scans picked up by these machines qualify as biometric data, the processing of which is prohibited under the GDPR unless certain conditions are met; where none of those conditions is met, the processing is deemed unlawful. Article 9 of the GDPR dictates that at least one of the following criteria must be met in order for the processing of biometric data to be lawful:


  1. Explicit consent to process that personal data has been given by the data subject for one or more specified purposes, except where Union or Member State law provides that the prohibition may not be lifted by the data subject.
  2. Processing the biometric data is necessary for the purposes of fulfilling obligations or exercising specific rights of the controller or the data subject in the field of employment, social security or social protection law.
  3. The processing is necessary to protect the vital interests of the data subject or another natural person if the data subject is physically or legally incapable of giving consent.
  4. The processing of biometric data is carried out in the course of its legitimate activities, with appropriate safeguards, by a foundation, association or any other not-for-profit body with a political, philosophical, religious or trade union aim, on condition that the processing relates only to members or former members of the body, or to persons in regular contact with it in connection with its purposes.
  5. The processing is relating to personal data which is manifestly made public by the data subject.
  6. The processing is necessary for the establishment, exercise or defence of legal claims.
  7. The processing is necessary for reasons of substantial public interest, including in the area of public health.
  8. The processing is necessary for the purposes of preventive or occupational medicine.
  9. The processing is necessary for archiving purposes in the public interest, or for scientific or historical research purposes or statistical purposes.
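In compliance-tooling terms, the test above reduces to “at least one Article 9(2) condition applies”. A minimal sketch of that logic follows; the condition names are illustrative shorthand for the list above, not terms from the GDPR text:

```python
# Illustrative sketch only: models Article 9's "at least one condition" rule.
# The condition labels are shorthand for the numbered list above.
ARTICLE_9_CONDITIONS = {
    "explicit_consent",
    "employment_social_security_law",
    "vital_interests",
    "nonprofit_members",
    "manifestly_made_public",
    "legal_claims",
    "substantial_public_interest",
    "occupational_medicine",
    "public_interest_archiving",
}

def biometric_processing_lawful(conditions_met: set) -> bool:
    """Processing is lawful only if at least one recognised condition applies."""
    return bool(conditions_met & ARTICLE_9_CONDITIONS)

print(biometric_processing_lawful({"explicit_consent"}))  # True
print(biometric_processing_lawful(set()))                 # False
```

A real assessment is of course a legal judgement, not a set lookup; the sketch only captures the disjunctive structure of Article 9(2).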


For more clarity on what is classified as biometric data, as well as other aspects of this technology, check out our post: 14 common misconceptions on biometric identification and authentication debunked.

Does your company process biometric identification data? Aphaia provides a number of services in relation to compliance with regard to data protection, including regarding biometric data: data protection impact assessments, Data Protection Officer outsourcing, and EU AI ethics assessments. Get in touch today to find out more.

Lincolnshire Police Trial CCTV

Lincolnshire Police Trial CCTV: this technology can even detect moods!

Lincolnshire police trial CCTV technology which can detect moods, eyewear and headgear, but not before a human rights and privacy assessment is carried out.


Lincolnshire police will soon debut their trial of CCTV cameras in Gainsborough. This is a new, more complicated and potentially controversial type of surveillance technology. Although the funding for this project has been approved and received, implementation of the new equipment is at a standstill due to privacy concerns surrounding the use of the technology. Key legal considerations need to be addressed before it can be released for use among the general public, as the technology can search for persons using parameters such as their mood, or apparel such as hats or sunglasses. Because the police have full control of the search parameters, the technology is inherently problematic, as seen in legal proceedings as recently as 2018.


A Welsh national, Ed Bridges, brought a legal case against the authorities over their use of a very similar facial recognition technology, raising the spectre of many ideological and privacy concerns about police having unquestioned access to intrusive means of surveillance and monitoring persons who may not be suspected of, or involved in, any crime. Although Mr. Bridges did not have instant success with his claim, as his first petition to the High Court was denied, in his subsequent Court of Appeal claim three of the five breaches of privacy he presented were upheld by the court as legally valid.


The police have acknowledged, and made attempts at addressing the public’s privacy concerns regarding the use of this technology.


Privacy concerns are a very important consideration prior to the deployment of this new technology for everyday use. The police have tried to assure the public that their rights are of paramount importance in the protocols surrounding this technology and how it is used. The local police have also released some preliminary information which may ease public anxiety around the technology’s implementation: the scans are not carried out in real time, and all footage is deleted after 31 days.


Legislation continues to be introduced regarding privacy and surveillance.


There are also larger debates about which search terms should be allowed, and under what circumstances they may be used, wherever this new surveillance technology is deployed. Legislation around government surveillance has also seen changes in the years since the Ed Bridges case, and it continues to be reformed in an attempt to safeguard everyone’s well-being without stripping them of their fundamental rights and privacies.


According to Cristina Contero Almagro, partner at Aphaia, ‘The risk is twofold: first, the police using the technology without the appropriate safeguards and second, the information being compromised and used maliciously by third-parties which may access it unlawfully. Considering the nature of the data involved, it is essential to put in place strong security measures which ensure the data will be adequately protected. It is important to note that once that biometric information has been exposed, the damage to the rights and freedoms of the affected data subjects is incalculable, as it is not something that can be changed like a password’.


‘Any facial recognition that includes profiling should be viewed with suspicion,’ comments Dr Bostjan Makarovic, Aphaia’s Managing Partner. ‘The challenge is that there is no way to object to such profiling because it takes place in real time as one enters a certain area. Law enforcement and public safety are important reasons but should not be used as a blanket justification without further impact assessment.’  

Does your company utilize facial recognition or process other biometric data? If yes, failure to adhere fully to the guidelines and rules of the GDPR and Data Protection Act 2018 could result in a hefty financial penalty. Aphaia provides both GDPR adaptation consultancy services and CCPA compliance, including data protection impact assessments, EU AI Ethics assessments and Data Protection Officer outsourcing. Contact us today.

Twitter Data Case Dispute

Twitter Data Case Dispute: European Union privacy regulators conflicted over whether, and how much, to fine over last year’s data breach.

Twitter Data Case Dispute between European Union privacy regulators, causing delay in the progress of the most advanced cross-border privacy case involving a U.S. tech company under the GDPR.

The Twitter data case dispute, disclosed in a statement from Ireland’s Data Protection Commission, is one of the first major tests for enforcement of the GDPR. It has raised concern over possible disagreements and delays in nearly two dozen other investigations into Facebook, Google, and other U.S. tech companies. This particular case concerns a security hole that Twitter claimed to have fixed in January 2019, which had exposed the private tweets of some users for a period of more than four years.


This Twitter case dispute will be an early indication of how power sharing among EU regulators plays out in similar situations.

The outcome of this Twitter case will be an early indication of how the EU’s power-sharing system among regulators will work in practice. Because Twitter has its regional headquarters in Ireland, the investigation is led by Ireland’s data commission. However, regulators in any of the 26 other EU countries involved can object to a case. Under the GDPR, in cases involving multiple countries, the lead regulator (in this case Ireland’s data commission) sends its draft decision to its counterparts, which then have four weeks to submit objections, followed by additional time to approve revisions based on those objections. Any disagreements the regulators cannot resolve can be referred to the European Data Protection Board, which decides by way of a vote. Once the board approves a decision, the lead regulator informs the company of that decision within a month. The voting process can take from one month to two and a half months, depending on whether extensions are granted.

After consultations with other EU authorities, a number of objections remained, triggering the first ever dispute resolution.

The Irish privacy regulator said it had triggered the dispute-resolution mechanism among the bloc’s privacy regulators after failing to resolve disagreements over its draft decision in the Twitter case. This is the first time the process has been triggered. Ireland’s data commission forwarded a draft decision to its counterparts for comments in May.

The commission engaged in consultations with other regulators to resolve their complaints. Graham Doyle, a deputy commissioner, said that despite the consultation a number of objections remained, and the matter has now been referred to the European Data Protection Board by the Data Protection Commission.

Under the GDPR, companies can be fined on a sliding scale of up to 2% of their annual revenue for this type of violation, with regulators weighing various factors.

Ireland’s data commission said that the focus of this case is whether Twitter met its obligation to notify the data breach in a timely manner. Under the GDPR, regulators can fine companies up to 2% of their worldwide annual revenue for failing to notify them of a data breach within 72 hours; based on Twitter’s 2019 revenue, this could amount to up to $69 million. However, the legislation also directs regulators to take into account the gravity and duration of the violation, the type of personal information involved, and other factors such as whether the violation was intentional. This leaves plenty of room for disagreement between regulators on how much should be charged for a violation.
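As a rough illustration of how the ~$69 million ceiling follows from the 2% cap: the article cites only the resulting figure, so the revenue number below (Twitter’s reported 2019 revenue of roughly $3.46 billion) is an assumption for the sketch.

```python
# Sketch of the GDPR cap for a breach-notification violation:
# up to 2% of worldwide annual revenue. The revenue figure is an
# assumed input (roughly Twitter's reported 2019 revenue, USD);
# the article itself only cites the resulting ~$69 million ceiling.

def gdpr_fine_cap(annual_revenue: float, rate: float = 0.02) -> float:
    """Maximum fine as a share of worldwide annual revenue."""
    return annual_revenue * rate

twitter_2019_revenue = 3.46e9  # assumed figure, USD
cap = gdpr_fine_cap(twitter_2019_revenue)
print(f"Maximum fine: ${cap / 1e6:.0f} million")  # about $69 million
```

The cap is only an upper bound: as the article notes, regulators must weigh gravity, duration, data type and intent before settling on an actual amount anywhere below it.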

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.