Google announces an AI advisory board – only to dissolve it

Google creates advisory board to monitor the ethical use of AI

In line with the draft set of AI Ethics Guidelines produced by the European Commission’s High-Level Expert Group on AI (AI HLEG) last December, Google and other Big Tech companies such as Amazon and Microsoft are taking steps to adopt an ethical use of AI. Google, for its part, has created an external advisory board to monitor AI ethics at the company.

The GDPR states that the data controller shall implement suitable measures to safeguard the data subject’s rights, freedoms and legitimate interests in relation to the use of AI, which makes unbiased algorithms and balanced training datasets necessary. This is an example of privacy by design, which requires a privacy expert to monitor the process from the very first stage of the project.
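
As a trivial illustration of the “balanced training dataset” point, here is a minimal sketch of a pre-training balance check. The `check_balance` helper, its `min_ratio` threshold and the even-share comparison are hypothetical simplifications for illustration, not a method prescribed by the GDPR or the draft guidelines.

```python
from collections import Counter

def check_balance(labels: list[str], min_ratio: float = 0.5) -> bool:
    """Crude balance check: flag a training set whose smallest group
    holds less than min_ratio of an even share. A hypothetical proxy
    for 'balanced data'; real bias audits go much further."""
    counts = Counter(labels)
    smallest_share = min(counts.values()) / len(labels)
    even_share = 1 / len(counts)
    return smallest_share >= min_ratio * even_share

# A 9:1 split across two groups: smallest share 0.1 < 0.5 * 0.5, so unbalanced.
print(check_balance(["f"] + ["m"] * 9))  # False
```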

Google also announced its AI Principles last June, with the aim of assessing AI applications against seven main objectives: be socially beneficial, avoid creating or reinforcing unfair bias, be built and tested for safety, be accountable to people, incorporate privacy design principles, uphold high standards of scientific excellence, and be made available for uses that accord with these principles.

Kent Walker, Senior Vice President of Global Affairs at Google, pointed to facial recognition and fairness in machine learning as some of the most relevant topics the advisory board would address. The board comprised international experts in the fields of technology, ethics, linguistics, philosophy, psychology and politics.

UPDATE: However, after some of its members drew wide criticism, Google has scrapped the initial board composition and gone back to the drawing board.

If you need advice on your AI product, Aphaia offers both AI ethics and Data Protection Impact Assessments.

ICO Adtech Forum Overview

The Forum aimed to help the ICO better understand the privacy implications of the analytics and digital tools used in advertising

My best friend’s birthday was last week, so I did some research on Google trying to find her favourite tennis shoes. I checked some sites to compare different models and prices, and suddenly the tennis shoes were everywhere: not only on my phone but also on my PC, which I had not even used to search for them. Online newspaper articles, Instagram, travel agency sites… tennis shoe ads were displayed on every single webpage I visited. Can you relate? The ICO Adtech Forum addressed this and other similar issues.

The Adtech Fact Finding Forum was held on 6 March 2019 and focused on transparency, lawful basis and security. It brought together more than a hundred people connected to adtech in some way: publishers, advertisers, civil society, start-ups, adtech firms and lawyers, who mainly discussed how people’s personal data is used in real-time bidding (RTB) in programmatic advertising.

RTB is an auction-based approach to buying and selling online advertising. Once an advertiser’s bid wins the auction, their personalised ad is instantly displayed in real time on the publisher’s site. This is why I kept bumping into tennis shoe ads again and again.
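
A minimal, hypothetical sketch of the auction step may help make this concrete. The `BidRequest` fields, the bidder names and the second-price rule below are illustrative assumptions, not the specification of any real exchange (real bid requests, e.g. under OpenRTB, carry far more personal data).

```python
from dataclasses import dataclass

@dataclass
class BidRequest:
    # Illustrative fields only; real requests carry far richer data,
    # which is exactly what makes RTB a privacy concern.
    user_id: str          # pseudonymous identifier, often cookie-based
    url: str              # the page hosting the ad slot
    interests: list[str]  # interests inferred from browsing history

def run_auction(request: BidRequest, bids: dict[str, float]) -> tuple[str, float]:
    """Pick the highest bidder and charge the second-highest price
    (a common, though not universal, pricing rule)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, clearing_price

# Note that every bidder receives the request, including the losers.
request = BidRequest(user_id="abc123", url="news.example.com",
                     interests=["tennis", "running shoes"])
bids = {"shoe-brand-dsp": 2.10, "travel-dsp": 1.40, "bank-dsp": 0.90}
print(run_auction(request, bids))  # ('shoe-brand-dsp', 1.4)
```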

The challenge is that this type of technology is based on the use of personal data, often at a massive scale. So how is this tackled in terms of information, lawful basis and data disclosure? Does it meet GDPR requirements? The issue raises bigger concerns, as some publishers felt that participating fully in RTB was currently the only commercially viable option available to them. The result is a trade-off: privacy risk could be reduced by reducing the number of third-party actors involved, but revenue can only be maximised by creating competition for the publisher’s advertising space. Additionally, tightening the requirements for RTB by making it mandatory to provide users with a visible option for setting their preferences might spell the end of RTB, as most users would not consent.

Some of the initial conclusions called for tighter control over the third parties the data is shared with, removing or truncating some of the information in the bid request, urging unsuccessful bidders not to retain any information, and making the distinction between controllers and processors in the RTB ecosystem clearer, among others. It is also worth highlighting that data controllers will likely have to undertake a Data Protection Impact Assessment for many instances of RTB.

The ICO offered a window for additional written submissions following the event, limited to 1,000 words, by the end of next week (just email them to events@ico.org.uk).

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

Aphaia at T3chfest 2019

T3chfest is one of the most important tech events in Spain and Aphaia was delighted to attend

T3chfest 2019 took place last week at University Carlos III of Madrid. It is a programming and technology event that has been held annually since 2013, aimed at students, companies and technology enthusiasts, all of whom could enjoy multiple talks, workshops and activities around technology during the two-day event.

A total of 82 talks were held on five different stages, given by speakers from the EU and the Americas. Big companies like Microsoft, Airbus and Accenture attended the event with the aim of meeting students who could become potential candidates.

It was a pleasure for us to attend T3chfest 2019 and enjoy talks discussing the biggest tech innovations of recent years. We are proud to note that some of those technologies are already in place in some of our clients’ businesses.

The most popular topics were bioprinting, videogames, blockchain, cybersecurity, artificial intelligence, apps and UX.

As a new technology consultancy, Aphaia provides several AI-focused services, such as GDPR risk analysis for decision-making algorithms and AI ethics assessments, so the AI talks were particularly relevant for us. Generative Adversarial Networks (GANs) are one of the latest and most interesting applications of AI.

GANs are deep neural network architectures comprised of two neural networks pitted against each other. The generative network produces candidates modelled on the training dataset, while the discriminative network evaluates them against that same dataset, deciding whether each candidate is real or fake.
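
For readers who like code, here is a minimal GAN training loop, assuming PyTorch is available; the network sizes, the 1-D Gaussian “training dataset” and the hyperparameters are arbitrary choices for illustration.

```python
import torch
import torch.nn as nn

# Generator: maps 8-dimensional noise to a single value.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
# Discriminator: maps a value to the probability that it is real.
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.25 + 4.0   # "real" samples from N(4, 1.25)
    fake = G(torch.randn(64, 8))             # generator's candidates

    # Discriminator step: label real samples 1 and fakes 0.
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Generator step: try to make the discriminator output 1 on fakes.
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())  # drifts towards 4.0 as G improves
```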

GANs can pose a serious cybersecurity risk, as they can be used maliciously for fake news, phishing and data theft. Because the generative network improves as it is trained against the discriminative network, it eventually becomes almost impossible to tell a real image or text from a fake one. GANs could be used, for example, to generate fake rental listings (thisrentaldoesnotexist) and insert malicious links to collect data unlawfully.

On the same note, as privacy experts we were especially interested in the talks that considered not only technology but also the privacy around it. One of the talks addressed how to protect privacy in open-source projects.

Open-source software is open to anyone by design, whether a community of developers, hackers or malicious users. Authors typically hide their identity behind nicknames and avatars; however, they have no protection against authorship attribution techniques, which can find stylistic fingerprints in the source code and be used to persecute and threaten the authors.
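
As a toy illustration of the idea, the sketch below extracts a few crude stylistic features from a source file. The feature set is a hypothetical simplification; real attribution systems use much richer signals (AST patterns, token n-grams) fed into a trained classifier.

```python
import re

def style_features(source: str) -> dict[str, float]:
    """A crude stylistic 'fingerprint' of a piece of source code."""
    lines = source.splitlines() or [""]
    words = re.findall(r"\b\w+\b", source)
    return {
        "tab_indent_ratio": sum(l.startswith("\t") for l in lines) / len(lines),
        "avg_line_length": sum(len(l) for l in lines) / len(lines),
        "comment_density": sum("#" in l for l in lines) / len(lines),
        "snake_case_ratio": len(re.findall(r"\b[a-z]+(?:_[a-z]+)+\b", source))
                            / max(1, len(words)),
    }

print(style_features("def my_func(x):\n\treturn x  # double\n"))
```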

It is important for authors to be aware of the privacy risks of open-source projects and to apply security measures from their very first contribution, such as obfuscating their coding style.

If you need advice on your AI product, Aphaia offers both AI ethics and Data Protection Impact Assessments.

Overview of Device Fingerprinting

The Spanish Data Protection Supervisory Authority, the AEPD, has published guidelines on device fingerprinting.

What is fingerprinting?

Beyond cookies and pixels, there are other identification and tracking techniques on the Internet that allow profiles to be built and the associated data to be monetised. One of these is so-called device fingerprinting, defined as the systematic collection of information about a particular remote device in order to identify and single it out. While it can be done for a legitimate purpose, such as enabling multi-factor authentication mechanisms, it can also be used for tracking and profiling, with the ultimate goal of exploiting the data, even though the information is initially collected for a technical purpose.
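
As an illustration of the principle, the sketch below hashes a handful of device attributes into a single stable identifier. The attribute list is a hypothetical simplification: real fingerprinting scripts combine dozens of signals (installed fonts, canvas rendering, audio stack and so on).

```python
import hashlib

def device_fingerprint(attributes: dict[str, str]) -> str:
    """Combine device attributes into one stable identifier.
    Each attribute alone is common; together they single out a device."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

print(device_fingerprint({
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/66.0",
    "screen": "1920x1080x24",
    "timezone": "Europe/Madrid",
    "language": "es-ES",
    "fonts": "Arial,Calibri,DejaVu Sans",
}))
```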

How is privacy affected by fingerprinting?

Given that people usually do not share their devices, singling out a device generally amounts to identifying an individual, which means Data Protection rules apply. An additional concern is the possibility of re-linking information to a user even after cookies have been deleted: fingerprinting prevents the loss of traceability of a user’s browsing habits, or can indeed be used for tracking as such. This increases the risk to the rights and freedoms of individuals, who most of the time are not even aware of being tracked.

How can an individual be identified using fingerprinting?

There are three main elements that allow a single device, and thus its user, to be identified by means of fingerprinting:

  • The gathering of a massive set of discriminating data.
  • The global nature of the internet.
  • A unique ID.

Can users block fingerprinting?

Even though there is currently no option to completely block fingerprinting, most browsers allow users to set up their privacy preferences. The World Wide Web Consortium (W3C) proposed a mechanism called Do Not Track (DNT), which gives the user an option to disable web tracking on the device. Under the W3C proposal, the DNT signal must reflect a deliberate choice by the user rather than being enabled by default without any positive action on their part.

That said, websites should check this parameter through JavaScript function calls to the user’s device, so that the controller knows the user’s preferences and, therefore, its options for processing the data. However, an analysis carried out by the AEPD showed that only 16.72% of sites check DNT before processing their users’ information, and in most cases where the DNT option is activated, sites keep collecting the fingerprint anyway, ignoring the user’s wishes. Furthermore, some of these programs even use the DNT request itself as an additional unique identification factor.
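
By way of example, here is a minimal server-side counterpart to that JavaScript check, assuming Flask: browsers with DNT enabled send the HTTP header “DNT: 1”, which the site can inspect before doing any profiling. The route and the commented-out profiling hook are hypothetical.

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    # Client-side scripts can read the same preference via navigator.doNotTrack;
    # server-side, it arrives as the "DNT" request header.
    if request.headers.get("DNT") == "1":
        # Respect the preference: serve the page without any tracking.
        return "Welcome (no tracking)"
    # track_visit(request)  # hypothetical profiling hook, for illustration only
    return "Welcome"

if __name__ == "__main__":
    app.run()
```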

Other ways to protect privacy on the internet:

  • Installing blockers.
  • Disabling JavaScript.
  • Alternating between different browsers.
  • Accessing the internet from virtual machines.
  • Limiting the installation of browser extensions.

Privacy and data protection requirements for the industry

For manufacturers and developers:

  • Products with privacy settings.
  • Maximum level of data protection by default.

For controllers that use fingerprinting:

  • Checking DNT preferences before processing any data.
  • Gathering users’ consent (even where DNT is disabled).
  • Including fingerprinting in the record of processing activities.
  • Data Protection Officer advice and oversight.
  • Risk analysis and a Data Protection Impact Assessment where relevant, considering:
      • The impact of the disclosure of the profiling information contained in the database.
      • In relation to the above, access to said information by governmental or political organisations.
      • Social, cultural or racial bias leading to automated decisions.
      • Access by employees or third parties to specific users’ data.
      • The use of the data for social, political or general harassment.
      • The excessive collection of data and its retention for excessive periods.
      • The impact of profiling on users’ perceived freedom of use.
      • The manipulation of users’ wishes, beliefs and emotional state.
      • In relation to the above, the risk of re-identification.

Fingerprinting risks are covered by GDPR Recital 30, which generically refers to online identifiers, meaning data protection rules directly apply to fingerprinting.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.