Real-time bidding, programmatic advertising and privacy risks

Our vlog this week covers adtech and real-time bidding (RTB).

Real-time bidding is a set of technologies and practices used in programmatic advertising that allows advertisers to compete for available digital advertising space in milliseconds, placing billions of online adverts on webpages and in apps by automated means.

In a nutshell, our digital footprint generates a lot of data about our activity on the internet. This data is collected by advertisers, who target us according to it. Publishers, for their part, auction in real time the ad space on the page we are viewing, and advertisers then bid for that space in order to display ads we may be interested in.
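To make the mechanics concrete, below is a minimal sketch, in Python, of how such an auction might work. All names (BidRequest, run_auction, sports_brand_bidder) are hypothetical and the bid request is heavily simplified; real exchanges use standards such as OpenRTB and complete the whole process within roughly a hundred milliseconds.

```python
from dataclasses import dataclass

@dataclass
class BidRequest:
    """Simplified bid request; real RTB requests carry far more data."""
    user_id: str          # cookie or device identifier
    page_url: str         # the page where the ad slot lives
    interests: list[str]  # inferred profile segments

@dataclass
class Bid:
    advertiser: str
    price_cpm: float  # price per thousand impressions
    ad_markup: str    # creative to display if the bid wins

def run_auction(request: BidRequest, bidders) -> Bid | None:
    """Collect bids from the participating bidders and pick the highest."""
    bids = [b for bidder in bidders if (b := bidder(request)) is not None]
    return max(bids, key=lambda b: b.price_cpm, default=None)

# Hypothetical bidder: bids only when the user profile matches its campaign.
def sports_brand_bidder(request: BidRequest) -> Bid | None:
    if "sports" in request.interests:
        return Bid("sports_brand", price_cpm=2.50, ad_markup="<img src='ad.png'>")
    return None

winner = run_auction(
    BidRequest(user_id="abc123", page_url="https://news.example", interests=["sports"]),
    [sports_brand_bidder],
)
print(winner)  # the winning bid, whose creative would be shown to the user
```

Note that the bid request itself discloses the user identifier and profile segments to every bidder, whether or not they win the auction, which is precisely the sharing the ICO is concerned about.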

Does this comply with the GDPR? The ICO has recently published a report addressing the main challenges that arise from the use of RTB.

Most of the challenges identified relate to transparency and consent:

  • identifying a lawful basis for the processing of personal data in RTB remains challenging, as the scenarios where legitimate interests could apply are limited, and methods of obtaining consent are often insufficient in respect of data protection law requirements;
  • the privacy notices provided to individuals lack clarity and do not give them full visibility of what happens to their data;
  • the scale of the creation and sharing of personal data profiles in RTB appears disproportionate, intrusive and unfair, particularly when in many cases data subjects are unaware that this processing is taking place;
  • it is unclear whether RTB participants have fully established what data needs to be processed in order to achieve the intended outcome of targeted advertising to individuals; and
  • in many cases there is a reliance on contractual agreements to protect how bid request data is shared, secured and deleted, which does not seem appropriate given the type of personal data sharing and the number of intermediaries involved.

RTB carries a number of risks. These include:

  • profiling and automated decision-making;
  • large-scale processing (including of special categories of data);
  • use of innovative technologies;
  • combining and matching data from multiple sources;
  • tracking of geolocation and/or behaviour; and
  • invisible processing.

Beyond these, many individuals have a limited understanding of how the ecosystem processes their personal data.

These issues make the processing operations involved in RTB of a nature likely to result in a high risk to the rights and freedoms of individuals. Many of the above factors constitute criteria that make data protection impact assessments (DPIAs) mandatory.

In our view, and especially considering the new ICO guidance on cookies, controllers should take certain actions prior to the processing, such as carrying out a DPIA and gathering consent for RTB. RTB should have a separate explanation and toggle in the consent pop-up and settings, just as is required for non-essential cookies; a minimal sketch of such a configuration follows below.
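As an illustration, a consent pop-up could expose RTB as its own category alongside the usual cookie categories. This is a minimal sketch with hypothetical category names and structure, not a reference consent-management implementation:

```python
# Hypothetical consent categories for a pop-up/settings panel. RTB gets its
# own toggle and explanation, separate from generic non-essential cookies.
CONSENT_CATEGORIES = {
    "essential": {
        "label": "Strictly necessary",
        "toggleable": False,  # always on; no consent needed
    },
    "analytics": {
        "label": "Analytics cookies",
        "toggleable": True,
    },
    "rtb": {
        "label": "Real-time bidding (personalised advertising)",
        "description": ("Your data may be shared with advertisers who bid, "
                        "in real time, to show you targeted ads."),
        "toggleable": True,
    },
}

def may_process(category: str, user_choices: dict[str, bool]) -> bool:
    """Process data for a category only with an explicit opt-in."""
    if not CONSENT_CATEGORIES[category]["toggleable"]:
        return True
    return user_choices.get(category, False)  # no choice recorded = no consent
```

The key design point is the default in the last line: absent an explicit opt-in, no RTB processing takes place.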

If you need advice on your AI product, Aphaia offers both AI ethics and Data Protection Impact Assessments.

Sweden’s first GDPR fine

A school in Sweden has been fined 200,000 SEK (approximately 20,000 euros) by the Swedish DPA for using facial recognition technology to monitor students' attendance.

The class attendance of 22 students was captured by a camera running facial-recognition software. The trial was conducted to determine whether the technology could become a standard procedure for cutting down the class time spent on registering attendance.

The students' faces were captured as biometric data, together with their full names. The data was stored on a local computer without an internet connection, kept in a locked cabinet. Consent was gathered from the students' guardians, and the school gave participants the option to withdraw consent and stop the trial. However, neither a risk assessment nor a prior consultation with the Swedish DPA was carried out.

The DPA found that the GDPR had been violated in three ways:

  • violation of the fundamental principles of Article 5, by processing personal data in a more intrusive manner than necessary relative to the purpose (attendance control);
  • violation of Article 9, by processing sensitive personal data (biometric data) without a valid legal basis; and
  • violation of Articles 35 and 36, by not fulfilling the requirements for a data protection impact assessment and prior consultation.

Even though the school maintains it had the students' consent, the DPA found that consent was not a valid legal basis, given the clear imbalance between the data subjects and the controller.

When it comes to the workplace, the Spanish DPA (AEPD) has ruled that controllers may gather biometric data (e.g. fingerprints) for attendance-control purposes as long as certain principles and requirements are met, chiefly purpose limitation and data minimisation.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

Fingerprinting and what it means for privacy

This week we discuss device fingerprinting.

First, though, we want to know: do you feel safe from online identifiers? Do you frequently delete cookies?

It’s time to up your game and here’s why…

What is fingerprinting?

Beyond cookies and pixels, there are other techniques for identifying and monitoring users on the Internet. Device fingerprinting combines information about a device's configuration (browser, operating system, screen, language, fonts and so on) into an identifier capable of singling that device out. While this can be done for a legitimate purpose, such as enabling multiple-factor authentication mechanisms, it can also be used for tracking and profiling: although the information is initially collected for a technical purpose, the ultimate goal may be to exploit that data.

Here is how fingerprinting affects privacy:

- Given that people usually do not share their devices, singling out a device allows the identification of an individual, which means data protection rules need to be applied.

- An additional concern is the possibility of re-linking the collected information to the user even after cookies have been deleted.

An individual can thus be identified through fingerprinting. Three main elements allow a single device to be singled out (see the sketch after this list):

- Data gathering.

- The global nature of the Internet.

- A unique ID.
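To illustrate the role of the unique ID, here is a minimal sketch of how a handful of device attributes might be combined into a fingerprint. The attribute names and values are hypothetical; real fingerprinting scripts draw on many more signals (canvas rendering, installed plugins, audio stack and so on):

```python
import hashlib

def device_fingerprint(attributes: dict[str, str]) -> str:
    """Hash a set of device/browser attributes into a stable identifier.

    Individually the attributes are unremarkable; combined, they can
    single out a device, and the resulting ID survives cookie deletion
    because it is recomputed from the device itself.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Hypothetical attributes a tracking script might read from a browser.
print(device_fingerprint({
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/115.0",
    "screen": "1920x1080x24",
    "timezone": "Europe/Madrid",
    "language": "es-ES",
    "fonts": "Arial,DejaVu Sans,Liberation Serif",
}))
```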

Fingerprinting risks are covered by the GDPR under Recital 30, which refers generically to online identifiers; data protection rules therefore apply directly.

Tips for users:

- Set up your privacy preferences in your browser settings.

- Opt in to the Do Not Track (DNT) mechanism, which signals to websites that you do not wish to be tracked on that device.

Tips for data controllers using fingerprinting:

- Check users' DNT preferences before processing any data (see the sketch after this list).

- Gather users' consent even where DNT is disabled.

- Include fingerprinting in the record of processing activities.
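As a sketch of the first two tips, a server-side handler could inspect the DNT header and the stored consent state before doing any fingerprinting. The DNT: 1 header is the standard Do Not Track signal; the function itself is hypothetical:

```python
def may_fingerprint(headers: dict[str, str], user_consented: bool) -> bool:
    """Fingerprint only if the user has not enabled Do Not Track
    AND has given explicit consent (consent is required either way)."""
    dnt_enabled = headers.get("DNT") == "1"  # standard Do Not Track header
    if dnt_enabled:
        return False          # respect the user's DNT preference outright
    return user_consented     # even without DNT, explicit consent is required

# Example: DNT is off, but the user never opted in -> no fingerprinting.
print(may_fingerprint({"DNT": "0"}, user_consented=False))  # False
```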

We advise you to:

- Carry out a risk analysis and a Data Protection Impact Assessment where relevant, considering the impact of any disclosure of the profiling information contained in the database.

- Avoid social, cultural or racial bias leading to automated decisions.

- Put in place access controls limiting employees' or third parties' access to specific users' data.

- Avoid excessive data collection and excessively long retention periods.

- Consider the impact that the use of profiling information may have on individuals' perceived freedom.

- Avoid manipulating users' wishes, beliefs and emotional state.

- Lastly, in relation to the above, consider the risk of re-identification.

If you need advice on your AI product, Aphaia offers both AI ethics and Data Protection Impact Assessments.


Apple to protect children’s privacy

Even though it could devastate some developers' businesses, Apple has decided to change its rules for kids apps.

Under the new rules, kids apps on Apple’s App Store will be banned from using external analytics software — invisible lines of code that collect extremely detailed information about who is using an app and how. Apple is also severely curtailing these apps' ability to sell ads, which underpins the business model that keeps many apps free. The changes were prompted in part by some children viewing inappropriate ads, Apple says.

These changes are part of a move to better protect users’ privacy by shielding children from data trackers, one that has been lauded by some privacy advocates. But some worry that, instead of protecting kids, the new rules may expose them to more adult apps.

A few app makers are worried that the new rules could limit their apps' ability to serve ads, forcing them to abandon the business models that currently keep their apps free. Apple says it was simply responding to parents’ concerns. Phil Schiller, Apple’s senior vice president of worldwide marketing, said parents were complaining to Apple about inappropriate advertising shown to their kids while using iPhone apps. “Parents are really upset when that happens because they trust us,” Schiller said.

Under the new rules, developers of mobile apps don’t have to stop collecting data themselves. (Apple’s own analytics software is also not banned under the new rules.) And once they collect the data, Apple can’t see what they do with it, such as sending it to a server where it can be analyzed by outside parties. In some sense, Apple could be making the problem worse by pushing data collection into the shadows, according to developers and people who work at analytics companies.

Apple’s App Store is already under the antitrust microscope. The company is facing a European investigation into allegations made by Swedish music app Spotify that Apple unfairly tipped the scales on the App Store in favor of Apple Music, a similar service. And the Supreme Court in May allowed a lawsuit to proceed that accuses Apple of using monopoly power to inflate app prices.

Kids apps are estimated to make up only a small portion of the millions of apps available in the store, though Apple declined to say what percentage they represent. It’s unclear exactly how many of those apps are collecting personally identifiable data on kids, and Apple declined to quantify how many are behaving badly.

Privacy advocates have been complaining for years about the problems Apple says it is trying to fight. The 1998 U.S. Children’s Online Privacy Protection Act and the newer European General Data Protection Regulation limit what data kids apps are able to track.

According to Cristina Contero Almagro, Aphaia Partner, “although this is definitely a step in the right direction, it remains to be seen how it applies in practice. These new rules show a theoretical concern of Apple, one of the Internet giants, about privacy, but data protection is more than written rules. With their own analytics software still allowed, children’s data will keep being collected, and thus remain exposed to misuse. And, what is worse, if there is no control over how and to whom the app developers transfer this data to external systems, individuals cannot exercise their data protection rights properly, which would be an unacceptable limitation of the GDPR”.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.