Data protection standards for adtech outlined by ICO

Data protection standards for adtech have been outlined by the ICO in order to ensure that companies safeguard people’s privacy online.


The ICO has called on companies across the adtech industry to address and eliminate existing privacy risks. The Information Commissioner recently published an opinion warning companies designing novel methods of online advertising that compliance with data protection law is paramount, and that the excessive collection and use of personal data must be curbed. While online advertising has become highly targeted, and therefore much more effective, this has been made possible by the availability and use of large volumes of personal data, allowing ads to be customized to the audience. Personal data must remain protected, and compliance with laws and standards relating to the protection of personal data is therefore imperative.



The UK’s Information Commissioner believes that market participants should aim for solutions that are focused on individuals’ rights, freedoms and interests.


The ICO, in its recently issued opinion, calls for a move away from intrusive tracking technologies as these are likely to continue to pose risks and test compliance. The opinion asks companies to “embody the core concepts of data protection by design and by default, and not reinforce or replicate intrusive practices.” The Information Commissioner lists five key principles upon which any solution, proposal or initiative should be built in order to support the key considerations for design, documentation, accountability and auditability. These principles include data protection by design, user choice, accountability, purpose, and reducing harm. These principles are to be considered holistically, and any proposals should demonstrate clearly how they are being applied.


To uphold data protection standards for adtech, the ICO has provided more specific guidance in the form of recommendations.


The ICO has provided several specific recommendations for companies that use adtech, to ensure that they not only remain compliant but also treat the rights and freedoms of individuals as a priority. The UK watchdog recommends explaining and demonstrating design choices in the architectural decisions for solutions, and ensuring that the organizations implementing these solutions are sufficiently enabled to integrate the necessary safeguards. The ICO also makes it clear that the benefits and outcomes of these solutions need to be fair and transparent. Data minimization remains important as a general rule, as does the need to protect users. The ICO recommends giving users meaningful control, and sets out, in this recent opinion, steps to ensure that user control is strengthened and takes precedence over processing in solution design.


The principles of proportionality and necessity must be considered, and organisations should be able to demonstrate that they cannot reasonably achieve the required purpose in any less intrusive way, in order to justify the impact on individuals. Solutions must allow organisations to easily identify and meet the requirements of an appropriate lawful basis, identifying where PECR requires consent and where that consent meets the GDPR standard. In addition, solutions must specifically address the potential for processing special category data, and allow organisations to identify the appropriate condition under which it is processed. The aim is for new online advertising proposals to improve trust and confidence in the digital economy, rather than threaten it.



The Information Commissioner welcomes further input, and reserves the right to revise the views expressed in the opinion based on further findings.


The Information Commissioner reserves the right to form a different view based on further findings, changes in circumstances and engagement with stakeholders. That said, the ICO is open to receiving further input that may help in understanding these developments from the perspective of data protection, or help market participants understand the broader data protection impacts of their proposals or how they may better incorporate data protection by design and default into their services.


Does your company have all of the mandated safeguards in place to ensure the safety of personal information collected on your website or app? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Children’s Code compliance called into question by the ICO

Children’s Code compliance was called into question by the ICO after an online child safety charity raised concerns about various companies.


The ICO has written to Apple and Google for clarification on their process of determining age ratings for apps available in the App Store and Google Play respectively. Since the introduction of the Children’s Code, ensuring children’s privacy, protection and safety online has been a top priority. Recently, concerns were raised by an online child safety charity, 5Rights Foundation, which led the UK Information Commissioner to call into question the tech giants’ assessment of the apps, as reported by TechCrunch.


After systemic breaches were discovered by the 5Rights Foundation, the ICO called on Apple and Google to clarify their process for determining app age ratings.


The charity conducted research over the summer and found systemic breaches which included insufficient age assurance, mis-advertisement of minimum ages for games on app stores, the use of dark patterns and nudges, data-driven recommendations that create risks for children, routine failure to enforce community standards, and low default privacy settings, among others. A letter was then written to the UK’s Information Commissioner expressing concern over the safety of children using these platforms and their respective apps. In a recent statement, the Information Commissioner expressed that the office of the ICO is currently conducting an “evidence gathering process to identify conformance with the code, and thus compliance with the underlying data protection law”.


The Information Commissioner, Elizabeth Denham, has written in response to the 5Rights Foundation stating “In this process, the ICO is taking a systemic approach; we are focusing our interventions on operators of online services where there is information which indicates potential poor compliance with privacy requirements, and where there is a high risk of potential harm to children.” In this letter the Information Commissioner also reported that the ICO has contacted Apple and Google “to enquire about the extent to which the risks associated with the processing of personal data are a factor when determining the age rating for an app”.


Close to 50 companies are currently being asked to answer questions on their Children’s Code compliance.


The ICO has not only written to Apple and Google on this matter, but has also written to a total of 40 organizations across the three tech sectors considered to present the highest risk to children: social media and messaging, gaming, and streaming platforms. This is in an effort “to determine their standards of conformance individually”. The ICO intends to write to nine more companies after the charity highlighted further concerns. The full list of companies being questioned on Children’s Code compliance has not been published by either the ICO or the 5Rights Foundation. However, most of these companies have already been contacted.


The 5Rights Foundation is calling on the ICO to ensure Children’s Code compliance in practice.


While the Children’s Code came into force on September 2nd, the standards were published well over a year ago, providing a grace period during which companies were expected to come into compliance. The 5Rights Foundation has expressed concern that the Code may be misinterpreted as a handful of safety measures, rather than a requirement for a holistic redesign of the systems and processes of services, ensuring their data collection practices are in the best interests of children. The foundation reiterated in its letter that “If the Code is to have real value in protecting children’s safety and rights in the digital environment, the ICO must make sure that it is respected in practice.”






Does your company have all of the mandated safeguards in place to ensure the safety of Children who may use your website or app? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Privacy class action lawsuit against Google halted by UK Supreme Court

A privacy class action lawsuit against Google has been halted by the UK Supreme Court, as the claimant was unable to prove damage to affected users.


A £3 billion class action lawsuit against tech giant Google has been denied by the UK Supreme Court. The case, originally filed by Richard Lloyd on behalf of a group called “Google, You Owe Us”, relates to the unlawful tracking of millions of iPhone users. Between August 2011 and February 2012, Google allegedly bypassed iPhone security and collected personal data through the Safari browser. The lawsuit was filed on behalf of 4.4 million residents of England and Wales, claiming £3 billion in damages. However, this case has been dismissed because the claimant was unable to prove any damage to the individuals from Google’s alleged unlawful tracking and data collection, according to this report from IAPP.


The judge dismissed the privacy class action lawsuit, stating that the affected individuals suffered no material damage or distress as a result of the breach.


The class action, previously dismissed in 2018 but subsequently revived by the UK Court of Appeal, has now been dismissed by the UK Supreme Court. The judge in this case, Lord Leggatt, concluded that there was no evidence of damage suffered by the individuals affected by this breach. Lord Leggatt said “The claimant seeks damages, for each individual member of the represented class without attempting to show that any wrongful use was made by Google of personal data relating to that individual or that the individual suffered any material damage or distress as a result of a breach.” Members of the public have expressed outrage at this ruling, claiming that it undermines equality, and that not enough has been done to protect the rights of the individual against large tech firms like Google which break the law and put the personal data of citizens at risk.



Privacy experts have been following this case very closely, due to the implications the ruling would have on other class actions in the UK.

As similar cases circulate, privacy experts have been anticipating the outcome of this class action lawsuit, knowing that the result may have far-reaching implications. One such case is that of TikTok, which stands accused of using children’s data without informed consent, as reported by the BBC. Lawyers claim that TikTok takes children’s personal information, including phone numbers, videos, exact location and even biometric data, without adequate warning and transparency, and without the consent required by law. Allegedly, neither children nor their parents are being made aware of what is being done with that information. TikTok has called these claims baseless and expressed its intent to fight them.



“This case stems from the right to compensation provided by the (UK) GDPR, whereby any person who has suffered a material or non-material damage as a result of an infringement of the (UK) GDPR can claim compensation from the controller or the processor. As a first step, one should try to obtain compensation by writing to or speaking with the organisation directly. However, if no agreement is reached, a court claim can be made. The seriousness of the breach and the impact on the individual, especially in terms of the distress caused, are two of the determining elements. In order for a controller or a processor to be exempt from this liability, they will need to prove that they are not in any way responsible for the event giving rise to the damage” explains Cristina Contero Almagro, partner in Aphaia.



Does your company have all of the mandated safeguards in place to ensure the safety of the personal data collected or processed? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Facial recognition in UK schools: ICO intends to intervene

Facial recognition in UK schools for collecting meal payments has raised concerns, and the ICO intends to intervene.


Nine schools in North Ayrshire are currently using a facial recognition payment system in their canteens. The change happened this month, and schools claim that it helps the lunch line move more quickly, while also offering a contact free method of payment for students during the COVID-19 pandemic. However, this situation has raised much concern, and the ICO deems it necessary to intervene and encourage a less intrusive approach to collecting quick, contact free payments for student meals.


While fingerprints have been used in UK schools for years, privacy advocates believe that using facial recognition technology is an unnecessary and intrusive step.


The use of live facial recognition technology in schools has been challenged because of issues with consent. While CRB Cunninghams, the company installing the software, claims that parents had to give explicit consent and that cameras check against encrypted faceprint templates stored on school servers, many are wondering whether such an intrusive method of collecting payments is indeed necessary. The use of facial recognition in UK schools involves sensitive, biometric, special category data. According to this article from The Guardian, Silkie Carlo, the director of Big Brother Watch, said “This is highly sensitive, personal data that children should be taught to protect, not to give away on a whim. This biometrics company has refused to disclose who else children’s personal information could be shared with and there are some red flags here for us.” According to the North Ayrshire council, 97% of children or their parents have consented to the use of the new system.


A representative from the ICO has suggested that school officials should consider a less intrusive manner of collecting payments if the same effect can be achieved.


The ICO has been made aware of the situation regarding facial recognition in North Ayrshire schools and intends to make an inquiry with the North Ayrshire council. The ICO is of the view that if the same effect can be achieved using less intrusive methods of collecting payments, then this should be done. Organisations which use facial recognition technology need to comply with data protection law before, during and after the use of the technology. A representative from the ICO said “Data protection law provides additional protections for children, and organisations need to carefully consider the necessity and proportionality of collecting biometric data before they do so.” While parents and guardians may have consented to the use of facial recognition in UK schools for collecting meal payments, the question here is whether this is indeed the best approach for contact free payments from minors.

Do you need additional insight on facial recognition and GDPR specific to your company’s operations? Aphaia can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.