Fingerprinting: what does it mean for privacy?

This week we discuss device fingerprinting.

First, though, we want to know: do you feel safe from online identifiers? Do you regularly delete cookies?

It’s time to up your game and here’s why…

What is fingerprinting?

Beyond cookies and pixels, there are other techniques for identifying and monitoring users on the Internet. Fingerprinting can serve a legitimate purpose, such as enabling multi-factor authentication mechanisms, but even where the information is initially collected for a technical purpose, it can also be used for tracking and profiling, with the ultimate goal of exploiting that data.

Here is how fingerprinting affects privacy:

-Given that people usually do not share their devices, singling out a device allows the identification of an individual, which triggers the application of data protection rules.

-An additional concern comes from the possibility of re-linking the collected information to the user even after cookies have been deleted.

Fingerprinting can identify an individual through three main elements, which together allow a single device to be singled out:

-Gathering data.

-The global nature of the Internet.

-A unique ID.
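The three elements above can be illustrated with a minimal, hypothetical sketch: attributes gathered from a device are combined into one stable identifier that is the same wherever the device appears on the Internet. Real trackers run similar logic in client-side JavaScript; all attribute names and values here are made up for illustration.

```python
import hashlib
import json

def fingerprint(attributes: dict) -> str:
    """Combine gathered device/browser attributes into a single stable ID.

    `attributes` stands in for the kind of data a tracking script might
    gather (user agent, screen size, timezone, language, fonts, ...).
    """
    # Canonical serialisation so the same attributes always hash the same way.
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

device_a = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/68.0",
    "screen": "1920x1080",
    "timezone": "Europe/Madrid",
    "language": "es-ES",
}
# A second device differing in only one attribute gets a different ID.
device_b = dict(device_a, timezone="Europe/London")

print(fingerprint(device_a) == fingerprint(device_a))  # same device, same ID
print(fingerprint(device_a) == fingerprint(device_b))  # different device, different ID
```

Note that no cookie is stored anywhere: deleting cookies changes nothing, because the ID can simply be recomputed from the device's attributes on the next visit.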

Fingerprinting risks are covered by the GDPR under Recital 30, which refers generically to online identifiers, meaning data protection rules apply directly.

Tips for users:

-Set up your privacy preferences in your browser settings.

-Enable the Do Not Track (DNT) mechanism, which signals to websites that you do not wish to be tracked on the device.

Tips for data controllers using fingerprinting:

-Check DNT preferences before processing any data.

-Gather users’ consent even where DNT is disabled.

-Include fingerprinting in the record of processing activities.
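The first two tips for controllers can be sketched as a simple server-side gate. This is a hypothetical helper, not a library API: browsers that have DNT enabled send it as the `DNT: 1` HTTP request header, and even when that header is absent, explicit consent is still needed before tracking.

```python
def tracking_permitted(headers: dict, has_consent: bool) -> bool:
    """Decide whether fingerprinting/tracking may proceed for a request.

    Honours the visitor's Do Not Track preference first, then falls
    back to explicit consent.
    """
    if headers.get("DNT") == "1":
        return False       # the user asked not to be tracked
    return has_consent     # no DNT signal: consent is still required
```

For example, `tracking_permitted({"DNT": "1"}, True)` is `False` (DNT always wins), `tracking_permitted({}, False)` is `False` (no consent), and only `tracking_permitted({}, True)` is `True`.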

We advise you to:

-Carry out a risk analysis and Data Protection Impact Assessment where relevant, considering the impact of the disclosure of profiling information contained in the database.

-Avoid social, cultural or racial bias leading to automated decisions.

-Create access controls for employees or third parties to specific users’ data.

-Avoid the excessive collection of data and retention for excessive periods.

-Consider the impact on the perception of the freedom of use of profiling information.

-Avoid the manipulation of user’s wishes, beliefs and emotional state.

-Lastly, in relation to the above, consider the risk of re-identification.

If you need advice on your AI product, Aphaia offers both AI ethics and Data Protection Impact Assessments.

 

Apple to protect children’s privacy

Even though it could devastate some developers’ businesses, Apple has decided to change its rules for kids’ apps.

Under the new rules, kids apps on Apple’s App Store will be banned from using external analytics software — invisible lines of code that collect extremely detailed information about who is using an app and how. Apple is also severely curtailing their ability to sell ads, which underpins the business model that results in many apps being free. The changes were prompted in part by some children viewing inappropriate ads, Apple says.

All these changes are part of a move to better protect users’ privacy by shielding children from data trackers, a move that has been lauded by some privacy advocates. But some worry that instead of protecting kids, the new rules may expose them to more adult apps.

A few app makers worry that the new rules could limit their apps’ ability to show ads, forcing them to abandon the business models that currently keep their apps free. Apple says it was simply responding to parents’ concerns. Phil Schiller, Apple’s senior vice president of worldwide marketing, said parents were complaining to Apple about inappropriate advertising shown to their kids while using iPhone apps. “Parents are really upset when that happens because they trust us,” Schiller said.

Under the new rules, developers of mobile apps don’t have to stop collecting data themselves. (Apple’s own analytics software is also not banned, according to the new rules.) And once they collect the data, Apple can’t see what they do with it, such as sending it to a server, where it can be analyzed by outside parties. In some sense, Apple could be making the problem worse by pushing data collection into the shadows, according to developers and people who work at analytics companies.

Apple’s App Store is already under the antitrust microscope. The company is facing a European investigation into allegations made by Swedish music app Spotify that Apple unfairly tipped the scales on the App Store in favor of Apple Music, a similar service. And the Supreme Court in May allowed a lawsuit to proceed that accuses Apple of using monopoly power to inflate app prices.

Kids apps are estimated to make up only a small portion of the millions of apps available in the store, though Apple declined to say what percentage they are. It’s unclear exactly how many of those are collecting personally identifiable data on kids, and Apple declined to quantify how many are behaving badly.

Privacy advocates have been complaining for years about the problems Apple says it is trying to fight. The 1998 U.S. Children’s Online Privacy Protection Act and the newer European General Data Protection Regulation limit what data kids apps are able to track.

According to Cristina Contero Almagro, Aphaia Partner, “although this is definitely a step in the right direction, it remains to be seen how it applies in practice. These new rules show a theoretical concern of Apple, one of the Internet giants, about privacy, but data protection is more than written rules. With their own analytics software still allowed, children’s data will keep being collected, and thus exposed to misuse. And, what is worse, if there is no control over how and to whom the app developers transfer this data to external systems, individuals cannot exercise their data protection rights properly, which would be an unacceptable limitation of the GDPR”.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

Facebook ‘Like’ button ECJ ruling

The Court of Justice of the European Union, the EU’s highest court, has ruled that an operator of a website that features a Facebook ‘like’ button can be a data controller jointly with Facebook.

What happened?

The EU Court of Justice weighed in on a dispute after an online fashion retailer was accused of violating EU law by embedding a Like plugin. Fashion ID, a German online clothing retailer, embedded on its website the Facebook ‘Like’ button. The consequence of embedding that button appears to be that when a visitor consults the website of Fashion ID, that visitor’s personal data are transmitted to Facebook Ireland. It seems that that transmission occurs without that visitor being aware of it and regardless of whether or not he or she is a member of the social network Facebook or has clicked on the ‘Like’ button.

A German public-service consumer association criticised Fashion ID for transmitting to Facebook the personal data of visitors without their consent, and in breach of their information obligation to visitors regarding the use and disclosure of their data under the Directive.

Decision

The Court finds, first, that the former Data Protection Directive does not preclude consumer-protection associations from being granted the right to bring or defend legal proceedings against a person allegedly responsible for an infringement of the protection of personal data. The Court notes that the new General Data Protection Regulation now expressly provides for this possibility.

The Court found that Fashion ID cannot be considered to be a controller in respect of the operations involving data processing carried out by Facebook Ireland after those data have been transmitted to the latter. It seems, at the outset, impossible that Fashion ID determines the purposes and means of those operations. By contrast, Fashion ID can be considered to be a controller jointly with Facebook Ireland in respect of the operations involving the collection and disclosure by transmission to Facebook Ireland of the data at issue, since it can be concluded that Fashion ID and Facebook Ireland jointly determine the means and purposes of those operations. Overall, the Facebook ‘Like’ button ECJ ruling concludes that websites and Facebook share joint responsibility.

The Court has now made its ruling and concluded that:

  1. With regard to the case in which the data subject has given his or her consent, the Court holds that the operator of a website such as Fashion ID must obtain that prior consent (solely) in respect of operations for which it is the (joint) controller, namely the collection and transmission of the data.
  2. With regard to the cases in which the processing of data is necessary for the purposes of a legitimate interest, the Court finds that each of the (joint) controllers, namely the operator of a website and the provider of a social plugin, must pursue a legitimate interest through the collection and transmission of personal data in order for those operations to be justified in respect of each of them.

According to Dr Bostjan Makarovic, Aphaia Managing Partner, “the Facebook ‘Like’ button ECJ decision strikes a balance between data subject rights and the commercial realities of web giants’ operations. It is important that the responsibility of the website owner does not extend to further processing of the data by the social network. That said, the assessment of the legitimate interest of the social network in the initial operation might still pose a challenge. Such assessment would best be provided by the social network itself, as part of the standard joint controller arrangement.”

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

CNIL imposes a 180,000-euro fine for a GDPR data breach

Active Insurances has been fined 180,000 euros by France’s data protection authority, the CNIL, which said the company “breached its obligation to secure personal data provided for by Article 32 of the [EU] General Data Protection Regulation.”

A customer alerted the CNIL in 2018 that he was able to access personal data of other customers, including their driver’s licenses, registration cards and bank identification records, from his personal account. The CNIL notified the company, which agreed to take corrective measures to protect its customers’ personal data.

The company informed the CNIL that measures had been taken. An on-site inspection was then carried out on the company’s premises, which found that:

  • the measures taken were not sufficient to prevent the documents from being indexed by search engines;
  • the passwords for customers’ personal accounts, whose format was imposed by the company, corresponded to the customers’ dates of birth, a format also indicated on the login forms;
  • after account creation, the username and password were sent to customers by email, stated in clear text in the body of the message.

On the basis of the investigations carried out, the restricted committee – the CNIL body responsible for imposing sanctions – considered that the company had breached its obligation to secure personal data under the GDPR.

The restricted committee considered that:

  • the company should have ensured that every person wishing to access a document was entitled to consult it;
  • indexing by search engines could have been prevented, for example by using a “robots.txt” file;
  • the company should have required users to choose stronger passwords, and should not have transmitted them in clear text by email.
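The password findings above translate into a simple validation rule. The sketch below is a hypothetical illustration, not the CNIL’s prescription: it rejects the criticised scheme (a password equal to the customer’s date of birth, here assumed to be written as DDMMYYYY) and enforces a basic strength policy instead.

```python
import re

def password_acceptable(password: str, date_of_birth: str) -> bool:
    """Reject the weak password scheme criticised by the CNIL.

    `date_of_birth` is assumed to be in DDMMYYYY form, the format
    the insurer reportedly imposed as the password itself.
    """
    if password == date_of_birth:
        return False   # password must not simply be the birth date
    if len(password) < 12:
        return False   # enforce a minimum length
    # require lowercase, uppercase, a digit and a symbol
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^a-zA-Z0-9]"]
    return all(re.search(c, password) for c in classes)
```

For example, `password_acceptable("01021990", "01021990")` is `False`, while a long mixed-character passphrase such as `"Corr3ct-Horse-Battery"` passes. Equally important, per the second finding, credentials should never be sent in clear text by email at all.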

The decision by the CNIL took into account the seriousness of the breach, because of the nature of the data and the documents in question (identity documents, information relating to infringements, bank details, etc.). It also took into account the number of people concerned, as the lack of security affected the accounts of several thousand customers and people who had terminated their contract with the company.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.