Facebook case forwarded

German Facebook case forwarded to ECJ with questions pending

The case was forwarded to the ECJ after Facebook appealed the German competition authority’s order to halt its data collection practices.

 

In recent times, Facebook has come under fire for its data collection practices, which span several integrated platforms. The company has been accused of ‘superprofiling’ and has been in court with German authorities over a pro-privacy order requiring it to stop combining user data across platforms without consent. The order has met with considerable resistance, and an appeal from Facebook has led German authorities to seek guidance from the European Court of Justice.

 

Facebook was accused of abuse of power for collecting and sharing data across platforms without user consent. 

 

There has been major concern over Facebook sharing data between its platforms, including Instagram, WhatsApp and Oculus, as well as with third-party apps. This, coupled with the volume of data Facebook collects freely without user consent, has led to the tech giant being accused of abuse of power by German authorities. The contention is that Facebook’s ability to build a unique database on each individual gives the firm an unfair market advantage over companies that do not have access to such detailed user data. The Bundeskartellamt (Federal Cartel Office, FCO) claims that this data collection is not lawful under the EU’s legal framework, as it essentially gives users no choice. There has been some pushback, however, particularly in preliminary hearings before Düsseldorf’s Higher Regional Court, where Judge Jürgen Kühnen argued that Facebook’s data use did not amount to an abuse of its dominant market position.

 

The German competition authority has attempted to place restrictions on Facebook’s collection of user data.

 

Earlier this year, Germany’s competition authority placed restrictions on Facebook’s data-processing activities. Under Facebook’s terms and conditions, users operate on the social networking platform on the precondition that their data will be collected. In February, however, the authority reached a preliminary decision on this practice and ordered Facebook to stop combining data collected from WhatsApp, Instagram and other third parties until it has obtained genuine, voluntary consent from users. Complying would require Facebook to considerably reduce its collection and combining of user data. The decision was not final, however, and left room for Facebook to appeal.

 

Facebook appealed the decision, arguing that its terms allow users to benefit fully from its services, and as a result the case has been forwarded to the ECJ.

 

Facebook appealed the decision made by the German competition authority in February of this year. At the time, Facebook said in a blog post: “While we’ve cooperated with the Bundeskartellamt for nearly three years and will continue our discussions, we disagree with their conclusions and intend to appeal so that people in Germany continue to benefit fully from all our services.” The German authority maintains that the social media company is guilty of a level of exploitative abuse which violates EU regulation. As a result, questions regarding the case have been forwarded to the European Court of Justice to reach a final decision.

 

Does your company have all of the mandated safeguards in place to ensure compliance with the ePrivacy rules, the GDPR and the Data Protection Act 2018 in handling customer data? Aphaia provides ePrivacy, GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, EU AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.

EDPB published VVA guidelines

EDPB published VVA guidelines in the context of the GDPR

The EDPB published VVA guidelines offering guidance on the use of virtual voice assistants in compliance with the GDPR.

 

Recently, the EDPB published its guidelines for the use of virtual voice assistants. A virtual voice assistant (VVA) is a system that understands and executes voice commands and works with other IT systems where needed. It acts as an interface between users and their devices or online services such as search engines. These services are very popular, particularly with the integration of smart devices and smart homes. Because these devices are present in homes and vehicles, and are even worn by users, they are often given access to a great deal of information on individuals, often of an intimate nature, which could threaten users’ right to privacy. As a result, VVAs have come under major scrutiny from several data protection authorities. By releasing these guidelines, the EDPB seeks to give guidance on the use of virtual voice assistants in the context of the GDPR as well as other applicable legal frameworks.

 

VVAs use machine learning methods which require the collection and interpretation of large amounts of voice data.

 

Virtual voice assistants rely heavily on machine learning methods to perform their wide range of tasks. For a start, these devices usually have a wake-up command, such as pressing a button or speaking a wake-up expression, which puts the device into active listening mode. VVAs typically depend on large data sets being collected, selected and labelled; both the quality and the quantity of the data matter, so VVA providers typically rely on snippets that give context to the use of the device and service in real conditions. In some circumstances a VVA can capture, in error, audio of individuals who did not intend to use the service, for example where the wake-up expression is accidentally detected, or where the wake-up expression has changed and the user triggers the device without realising it. For this reason, among several others, it is imperative that VVA services operate in compliance with the GDPR, particularly regarding the storage of data.

 

The guidelines set out by the EDPB outline the legal framework for VVAs, covering not just the GDPR but, in some cases, the e-Privacy Directive.

 

Because VVAs will undoubtedly process significant amounts of personal data, the relevant legal framework for VVAs is the GDPR. In addition, for all actors who store, or access information stored in, the terminal equipment of a subscriber or user, the e-Privacy Directive sets a specific standard. “Terminal equipment” covers smartphones, smart TVs and similar IoT devices, and a VVA should itself be considered terminal equipment when information on it is stored or accessed; in all of those cases, the provisions of the e-Privacy Directive apply. The VVA guidelines published by the EDPB provide guidance on the identification of data processors and stakeholders, transparency, the processing of children’s data, the processing of special categories of data, and many other elements of data protection relating to VVAs.

 

The EDPB published VVA guidelines, specifically outlining mechanisms for exercising Data Subject Rights. 

 

The EDPB has suggested several mechanisms for exercising data subject rights, including the rights of access, rectification, erasure and data portability. Data controllers must allow all users, whether registered or not, to exercise all of those rights. Controllers must provide information on data subjects’ rights, ideally when a data subject turns on a VVA, and at the very latest when the first voice request is processed. Since the main intended interaction with VVAs is by voice command, and some VVA users are persons with disabilities who rely on voice assistance, VVA designers should ensure that users can exercise any of their data subject rights using easy-to-follow voice commands. The EDPB suggests implementing specific tools in the development of VVAs, providing efficient and effective ways to exercise data subject rights.

 

Do you provide VVA services or smart devices that use VVA services? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, transfer impact assessments and Data Protection Officer outsourcing.  Contact us today.

New EU ePrivacy rules

New EU ePrivacy rules update

The ePrivacy rules governing electronic communication data will be updated as agreed upon by EU Member States. 

 

Earlier this month, EU member states agreed upon a negotiating mandate for revised ‘ePrivacy’ rules. These rules on the protection of privacy and confidentiality in the use of electronic communications define the cases in which service providers are allowed to process electronic communications data or access data stored on an end user’s device. The ePrivacy Directive was last updated in 2009, and the member states agree that the legislation needs to be brought up to date with new technological and market developments. The new ePrivacy Regulation will repeal the current ePrivacy Directive and is intended to complement and particularise the GDPR. The regulation will enter into force 20 days after its publication in the EU Official Journal and will start to apply two years later. Details can be found in this press release by the European Council.

 

The revised draft regulation will cover content from electronic communication over public services and networks, as well as related metadata. 

 

The draft ePrivacy Regulation will repeal the existing directive and will cover content transmitted via public services and networks, as well as related metadata, when end users are in the EU. Metadata includes, for example, information on the time, location and recipient of a communication, and is considered potentially as sensitive as the content of the communication itself. The rules will also cover the handling of data transmitted from machine to machine via a public network.

 

Any electronic communication data will be considered confidential, except when permitted by the ePrivacy regulation. 

 

As a general rule, all electronic communications data is to be considered confidential and should not be processed without the user’s consent. There are, however, a few exceptions specifically outlined in the ePrivacy Regulation, including processing for the purposes of checking for malware and viruses and for ensuring the integrity of the communication service. Provision is also made for cases where the service provider is required to process data by EU or member state law for the prosecution of criminal offences or the prevention of threats to public security.

 

Metadata may be processed for very specific purposes, and with strong additional safeguards applied to it. 

 

Metadata may be processed, for example, for billing purposes or for detecting and preventing fraud. With users’ consent, service providers may use metadata to display traffic movements to help public authorities develop new infrastructure where needed. Processing is also allowed where users’ vital interests need to be protected, for example in monitoring epidemics or in emergencies such as natural and man-made disasters. In specific cases, network providers may process metadata for purposes other than those for which it was collected; the intended purpose must then be compatible with the initial purpose, and strong specific safeguards must be applied to the processing.

 

It will be possible for users to whitelist service providers, giving consent to certain types of cookies from certain websites via their browser settings.

 

Users will be able to permit certain types of cookies from one or more service providers, and change those settings easily, in their browser settings. This should make cookie permissions simpler and more seamless, alleviating cookie consent fatigue, and end users will genuinely be able to choose whether to accept cookies or similar identifiers. Instead of using a paywall, service providers may make access to a webpage or website dependent on consent to the use of cookies for additional purposes; however, this will only be allowed if the user can access an equivalent offer from the same provider that does not involve consenting to the use of cookies.

 

Does your company have all of the mandated safeguards in place to ensure compliance with the ePrivacy rules, GDPR, and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.

 

The next update to iOS

The next update to iOS could significantly impact targeted advertising on free apps.

The next update to iOS has created friction between Apple and advertising giants like Facebook which rely on targeted ads for revenue. 

 

The next update to iOS, initially announced last summer, will force app developers to explicitly seek permission to access the phone’s unique identifier, known as the IDFA. The update is expected early in spring and is likely to significantly reduce the effectiveness of targeted mobile ads. To tailor mobile ads to smartphone users, app developers and other industry players typically access this unique identifier on the device. Once the rollout takes effect, however, users will see a prompt asking for permission to access their IDFA, and roughly half of users are expected to refuse.
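As background on the mechanism, here is a minimal sketch of how an app might request this permission using Apple’s AppTrackingTransparency framework. The wording of the system prompt itself is controlled by Apple; the developer supplies only a usage-description string (the `NSUserTrackingUsageDescription` entry below), and the function name here is illustrative.

```swift
import AppTrackingTransparency
import AdSupport

// Requires an NSUserTrackingUsageDescription entry in the app's Info.plist
// explaining why tracking permission is being requested.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user consented: the IDFA is available for ad targeting.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorised, IDFA: \(idfa)")
        case .denied, .restricted:
            // No consent: the IDFA is returned as all zeroes.
            print("Tracking declined; IDFA unavailable")
        case .notDetermined:
            // The prompt has not yet been shown to the user.
            print("Tracking status not yet determined")
        @unknown default:
            break
        }
    }
}
```

Once permission is denied, the system returns a zeroed-out identifier, which is what makes the change so consequential for ad targeting.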

 

The effectiveness of targeted advertising relies heavily on access to personal identifiers like Apple users’ IDFA. 

 

Targeted advertising relies heavily on access to significant amounts of personal data to determine who is most likely to respond to a particular message, and how and when to deliver that message for maximum impact. For targeted ads to be truly effective, access to data through Apple users’ IDFA is therefore key, and this update from Apple will no doubt significantly impact targeted advertising.

 

Facebook argues that these changes will have dire consequences for small businesses which depend on targeted advertising on free apps like its own.

 

One industry leader which generates much of its revenue through advertising has spoken up about the anticipated update. In a recent blog post, Facebook expressed its disagreement with Apple’s approach, complaining that Apple provides no context on the benefits of targeted ads and suggesting that Apple’s new prompt implies a trade-off between personalised advertising and privacy. Facebook argues that the two are not mutually exclusive, and that it can and does provide both.

 

Facebook argues that these changes will significantly impact the income of small business owners who rely on targeted ads via free apps to reach the customers most likely to convert into revenue for their businesses. Facebook intends to show Apple’s prompt asking for consent, but also to include its own prompt providing context on the benefits that targeted advertising offers users.

 

Some industry leaders are opting to give up access to certain data, eliminating the need to seek consent. 

 

Google has also spoken up about the change and how it plans to navigate it. The company plans to stop using any data that falls under Apple’s AppTrackingTransparency framework for its iOS apps, which will exempt it from needing to show the prompt. Google is essentially forgoing access to a significant amount of personal data in order to avoid having to seek consent.

 

How do data protection laws and this era of consent affect targeted advertising?

 

The GDPR defines consent as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”. In other words, consent must be unambiguous and given by a statement or a clear affirmative action.

 

Data protection laws like the GDPR and CCPA are designed to empower consumers, giving them more control over their personal information. The GDPR in particular operates on an “opt-in” model of consent, as clarified in its definition of the term: it cannot be assumed that a user has given consent simply because they have not opted out. Users must clearly and unambiguously opt in, and companies cannot assume consent unless the user has been asked, in the right way, and has given a clear affirmative response. From Apple’s perspective, this update falls in line with the GDPR, seeking clear, unambiguous consent from users to share a unique identifier such as the IDFA. “The philosophy behind it is similar to that of cookie consents for websites, only in the world of iOS apps,” comments Dr Bostjan Makarovic, Aphaia’s Managing Partner. There is no doubt, however, that this update will affect the current model of advertising: not just companies like Facebook, which generate much of their income through their ability to provide targeted ads on their free platforms, but also much smaller businesses seeking their targeted advertising audience through the social network giant.

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.