New cookie consent popup launched by Google following CNIL fine

Google is rolling out a new cookie consent popup after being fined by France's CNIL over the way it presented cookie choices to users.

 

Google recently shared a preview of its new cookie consent popup. The new popup will initially be available on YouTube in France, but Google has said it plans to roll the design out across all of its services in Europe. It comes a few months after France's CNIL fined Google €150 million for breaching data protection law: according to the CNIL, the previous popup failed to comply with the rules on how tracking choices must be presented to users. Not only has the text been updated; more importantly, the choices offered at the bottom of the popup are very different.

 

Google made some drastic changes to the choices offered at the bottom of the new cookie consent popup.

 

The choices at the bottom of the new cookie consent popup are radically different. In the old design, users had two options: “I Agree” and “Customize”. Users who clicked “Customize” were taken to a separate web page with several options; to disable all personalization settings, they had to click “off” three times and then confirm. The new design adds a third option, a “Deny All” button that lets users opt out of tracking altogether with a single click, and the two main buttons have the same color, size and shape. Under the EU GDPR and the ePrivacy rules, online services must obtain clear consent from their users before processing data from cookies that are not strictly necessary, and that consent must be informed, specific and freely given to be legally valid. The new approach should allow Google to obtain more meaningful consent from users.
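To make the requirement concrete, here is a minimal, hypothetical sketch of a consent banner built along these lines: nothing beyond strictly necessary cookies is set until the user explicitly accepts, and rejecting everything takes a single click on a button given the same prominence as the accept button. The cookie name, labels and helper functions are illustrative assumptions, not Google's implementation.

// Minimal, hypothetical consent banner sketch (TypeScript, browser DOM) — not Google's implementation.
// Non-essential cookies are only enabled after an explicit "Accept all" click;
// "Reject all" opts out with a single click, and both buttons share one style.

type ConsentChoice = "accepted" | "rejected";

const CONSENT_COOKIE = "cookie_consent"; // hypothetical cookie name

function saveChoice(choice: ConsentChoice): void {
  // Persist only the decision itself (strictly necessary), nothing else yet.
  document.cookie = `${CONSENT_COOKIE}=${choice}; max-age=${60 * 60 * 24 * 180}; path=/; SameSite=Lax`;
}

function enableNonEssentialCookies(): void {
  // Only reached after an explicit, informed opt-in,
  // e.g. load analytics or personalization scripts here.
}

function showConsentBanner(): void {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");
  banner.textContent = "We use cookies for analytics and personalization. ";

  // Both buttons get identical styling so neither choice is visually privileged.
  const makeButton = (label: string, choice: ConsentChoice) => {
    const btn = document.createElement("button");
    btn.textContent = label;
    btn.className = "consent-btn"; // same color, size and shape for both
    btn.addEventListener("click", () => {
      saveChoice(choice); // one click records the decision
      if (choice === "accepted") enableNonEssentialCookies();
      banner.remove();
    });
    return btn;
  };

  banner.append(makeButton("Reject all", "rejected"), makeButton("Accept all", "accepted"));
  document.body.append(banner);
}

// Only prompt users who have not already made a choice.
if (!document.cookie.includes(`${CONSENT_COOKIE}=`)) {
  showConsentBanner();
}

The key point the sketch tries to capture is that refusing takes no more effort, and no more clicks, than accepting.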

 

Following specific directives from the CNIL, Google has overhauled its approach to managing cookies.

 

After the initial rollout of the updated popup on YouTube in France, Google plans to use the same design for its search engine across the European Economic Area, the U.K. and Switzerland. Many users won’t see the updated popup: users who are already logged into a Google account have settings stored in their profiles, and people who use Google Chrome most likely have their browser tied to a Google account if they have ever logged into a Google service in the past. New users will see the additional options in the new cookie consent popup, while existing users can still review their privacy settings. “Following conversations and in accordance with specific directives from the Commission nationale de l’informatique et des libertés (CNIL), we carried out a complete overhaul of our approach. In particular, we have changed the infrastructure we use to manage cookies,” Google wrote in a recent blog post.

Does your company use cookies to collect data through a website or app? Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.


EU approves emergency measures for children’s protection

Temporary emergency measures for children’s protection have just been adopted by the European Parliament.

 

Temporary emergency measures for children’s protection were adopted by the European Parliament on July 6th. The regulation allows electronic communication service providers to scan private online messages for material depicting child sexual abuse. The European Commission reported that almost 4 million visual media files containing child abuse were reported last year, along with 1,500 reports of grooming of minors by sexual predators. Over the past 15 years, reports of this kind have increased by 15,000%.

 

This new regulation, under which scanning is expected to be carried out using AI, has raised some questions regarding privacy.

 

Electronic communication service providers are being given the green light to voluntarily scan private conversations and flag content that may depict child sexual abuse. The scanning procedure will identify content for flagging using AI, under human supervision. Providers will also be able to use anti-grooming technologies once consultations with data protection authorities are complete. These mechanisms have received some pushback over privacy concerns: last year, the EDPB published a non-binding opinion questioning whether the measures would threaten the fundamental right to privacy.

 

Critics argue that this law will not prevent child abuse, but will instead make it more difficult to detect, while potentially exposing legitimate communication between adults.

 

This controversial legislation, drafted in September 2020 at the peak of the global pandemic, which saw a spike in reports of minors being targeted by predators online, enables companies to voluntarily monitor for material related to child sexual abuse; it does not require them to act. Still, several privacy concerns were raised regarding its implementation, particularly around exposing legitimate conversations between adults which may contain nude material, violating their privacy and potentially opening them up to some form of abuse. During the negotiations, changes were made to require that users be informed that their communications may be scanned, and to set data retention periods and limits on how the technology may be used. Despite this, the initiative was criticized on the grounds that automated tools flag irrelevant material in the majority of cases, and concerns were raised about the possible effect on channels for confidential counseling. Ultimately, critics believe the law will not prevent child abuse but will make it harder to discover, as it would push abusers towards more hidden tactics.

 

This new EU law for children’s protection is a temporary solution for dealing with the ongoing problem of child sexual abuse. 

 

From the start of 2021, the definition of electronic communications under EU law changed to include messaging services. As a result, private messaging, which was previously regulated by the GDPR, is now regulated by the ePrivacy directive. Unlike the GDPR, the ePrivacy directive did not include measures to detect child sexual abuse, and voluntary reporting by online providers fell dramatically after the change. Negotiations on revising the ePrivacy directive to include protection against child sexual abuse have stalled for several years. This new EU law for children’s protection is therefore a temporary measure, intended to last until December 2025 or until the revised ePrivacy directive enters into force.

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.