
French DPA provides GDPR recommendations regarding chatbots

The CNIL, France's data protection authority, has published GDPR recommendations on chatbots, along with insights on the implications of their use. 

 

Chatbots are a fairly common feature on websites today, allowing users to have frequently asked questions answered quickly and easily, and providing other useful information in an interactive way. Personal data is typically processed in the course of these interactions, so data controllers and processors must remain mindful of any issues relating to the rights and freedoms of individuals. Where one has been appointed, a Data Protection Officer can be helpful in this regard, as there are cases where a Data Protection Impact Assessment is recommended or even required. 

 

Chatbots typically require the placement of cookies, which must comply with regulation. 

 

Chatbots preserve conversation history across the different pages of a website where they are present, and to achieve this, cookies are frequently placed on users’ devices. This must be done in accordance with data protection law. The CNIL has published recommendations on chatbots and on navigating the use of cookies in accordance with the French Data Protection Act, particularly Article 82, which governs the use of cookies. 

 

Two ways to place chatbot cookies. 

 

Because the presence or use of a chatbot requires the deposit of cookies onto a user’s device, permission may be required to do so. There are two available options for the chatbot operator. The first is to obtain the user’s prior consent to deposit the cookie. This consent must be free, specific, informed and unambiguous. The second is to place the cookie only when the user activates the chatbot, by clicking a button that specifically triggers its opening. In this case, separate consent is not required, as the cookies serve the sole purpose of providing the chatbot service. However, if the tracker used for the chatbot serves any purpose other than the chatbot itself, user consent is required. The data collected by this tracker must be stored only for as long as is necessary to achieve the purpose of the processing. 
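As an illustration of the second option, the TypeScript sketch below places the chatbot cookie only inside the handler that opens the chatbot, never on page load. The names, the cookie-store interface and the retention period are assumptions for illustration, not part of the CNIL guidance.

```typescript
// Minimal sketch of activation-gated cookie placement. The CookieStore
// interface is an assumption standing in for document.cookie or a wrapper.
type CookieStore = {
  set(name: string, value: string, maxAgeSeconds: number): void;
  get(name: string): string | undefined;
};

class ChatbotSession {
  private static readonly SESSION_COOKIE = "chatbot_session"; // assumed name
  private static readonly MAX_AGE_SECONDS = 30 * 60; // assumed retention: 30 min

  constructor(private cookies: CookieStore) {}

  onPageLoad(): void {
    // Intentionally does nothing with cookies: merely rendering the page
    // (or showing the chatbot button) must not place the tracker.
  }

  onUserOpensChatbot(): void {
    // Called only from the click handler of the button that opens the chatbot,
    // so the cookie exists only once the user has actively requested the service.
    if (!this.cookies.get(ChatbotSession.SESSION_COOKIE)) {
      const id = Math.random().toString(36).slice(2);
      this.cookies.set(
        ChatbotSession.SESSION_COOKIE,
        id,
        ChatbotSession.MAX_AGE_SECONDS
      );
    }
  }
}
```

Under this design, no separate consent prompt is needed for the session cookie, because it is placed only as a direct consequence of the user's own action and serves no other purpose.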

 

French DPA recommendations on the collection of special categories of data by a chatbot. 

 

The CNIL advises that special attention should be paid when collecting special categories of data, such as information relating to health, religious affiliation or political opinions. In some cases the collection of this information is foreseeable and the processing therefore relevant: a chatbot for a health-related assistance service, for example, may collect and process relevant health data. In those cases it is necessary to ensure that the processing complies with Article 9(2) of the GDPR. The processing of special categories of data is one of nine criteria which can make a Data Protection Impact Assessment necessary, and where more than one of these criteria is met, a Data Protection Impact Assessment may become mandatory. “This might be the case where minors’ data is involved or where the data gathered by the chatbot is combined, compared or matched with data from other sources”, comments Cristina Contero Almagro, Partner at Aphaia.

 

In other cases the collection of such sensitive data is not foreseeable: chatbots often offer the option to type freely, and the data controller or processor may not have anticipated sensitive data being volunteered by a user. In those cases prior consent is not required. However, mechanisms must be put in place to minimise the risks to the rights and freedoms of individuals. This can be done by displaying a message before or when the chatbot is launched, urging people to refrain from communicating special categories of data. In addition, a purge system can be set up, since retention of the sensitive data is not necessary.
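A purge system of the kind described could look like the sketch below, assuming a simple in-memory transcript and a flag set by some upstream check. The field names, the flagging mechanism and the 24-hour retention window are all illustrative assumptions, not values prescribed by the CNIL.

```typescript
// Illustrative purge routine for chat transcripts.
interface ChatMessage {
  text: string;
  sentAt: number;            // Unix timestamp in milliseconds
  flaggedSensitive: boolean; // e.g. set by a keyword or classifier check (assumed)
}

const RETENTION_MS = 24 * 60 * 60 * 1000; // assumed 24-hour retention window

// Drop messages flagged as containing special-category data immediately,
// and drop any remaining messages once the retention window has elapsed.
function purgeTranscript(log: ChatMessage[], now: number): ChatMessage[] {
  return log.filter(
    (m) => !m.flaggedSensitive && now - m.sentAt < RETENTION_MS
  );
}
```

A routine like this would typically run on a schedule, so that sensitive data volunteered by users is never retained beyond what the chatbot service actually needs.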

 

Conversations with a chatbot may not, on their own, be used for decision making affecting an individual.

 

Regardless of the nature of the conversation, an exchange with a chatbot alone, without any human intervention, cannot lead to important decisions affecting the person concerned. This includes the refusal of an online credit application, the application of higher rates, or the inability to submit an application for a position. Conversations with chatbots may, however, form part of a larger process that includes meaningful human involvement.

 

Article 22 of the GDPR prohibits automated decision-making which produces legal effects or similarly significantly affects an individual. Exceptions include cases where the person has given explicit consent, as well as where the decision is necessary for a contract between the data subject and the controller. In either case, the data subject must be provided with the means to obtain human intervention, which a chatbot alone cannot provide.
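One way to picture this safeguard in a chatbot-driven workflow is the sketch below: an adverse outcome suggested by the automated flow is never finalised without human review, while favourable outcomes may stand. The decision model and all names are assumptions for illustration, not a prescribed implementation of Article 22.

```typescript
// Illustrative human-in-the-loop gate for decisions suggested by a chatbot.
type Decision = "approve" | "refuse";

interface DecisionOutcome {
  decision: Decision;
  finalised: boolean;          // can this outcome take effect yet?
  requiresHumanReview: boolean;
}

function evaluateCreditApplication(
  automatedSuggestion: Decision,
  humanReviewed: boolean
): DecisionOutcome {
  // An adverse outcome (e.g. refusing an online credit application) must
  // wait for meaningful human review before it can affect the applicant.
  if (automatedSuggestion === "refuse" && !humanReviewed) {
    return { decision: "refuse", finalised: false, requiresHumanReview: true };
  }
  return {
    decision: automatedSuggestion,
    finalised: true,
    requiresHumanReview: false,
  };
}
```

The design point is that the chatbot only ever produces a suggestion; the system treats a refusal as pending until a human has intervened, which is the guarantee a chatbot alone cannot provide.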

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.


New EU ePrivacy rules update

The ePrivacy rules governing electronic communication data will be updated as agreed upon by EU Member States. 

 

Earlier this month, EU member states agreed upon a negotiating mandate for revised ‘ePrivacy’ rules. These rules on the protection of privacy and confidentiality in the use of electronic communications define the cases in which service providers are allowed to process electronic communications data or access data stored on an end user’s device. The last update to the ePrivacy directive was in 2009, and the member states agree that this legislation needs to be brought up to date with new technological and market developments. The new ePrivacy Regulation will repeal the current ePrivacy Directive and is intended to complement and particularise the GDPR. The regulation will enter into force 20 days after its publication in the EU Official Journal and will start to apply two years later. Details can be found in this press release by the European Council.

 

The revised draft regulation will cover content from electronic communication over public services and networks, as well as related metadata. 

 

This draft ePrivacy regulation will repeal the existing directive and will cover content transmitted via public services and networks, as well as related metadata, when end users are in the EU. Metadata refers to information such as the time, location and recipient of a communication, and is considered potentially as sensitive as the content of the communication itself. The rules will also cover the handling of data transmitted from machine to machine via a public network. 

 

Any electronic communication data will be considered confidential, except when permitted by the ePrivacy regulation. 

 

As a general rule, all electronic communication is to be considered confidential, and should not be processed without the consent of the user. There are, however, a few exceptions specifically outlined in the ePrivacy regulation. These exceptions include any processing for the purposes of checking for malware and viruses as well as for ensuring the integrity of the communication service. Provisions are also made for cases where the service provider is required to do so by EU or member states’ law with regard to the prosecution of criminal offenses or the prevention of public security threats. 

 

Metadata may be processed for very specific purposes, and with strong additional safeguards applied to it. 

 

Metadata may be processed, for example, for billing purposes or for detecting and preventing fraud. If users give their consent, service providers may use metadata to display movements of traffic to help public authorities develop new infrastructure where needed. This processing is also allowed where users’ vital interests need to be protected, for example in the monitoring of epidemics or in emergencies such as natural and man-made disasters. In specific cases, network providers may process metadata for purposes other than that for which it was collected; in those cases, the intended purpose must be compatible with the initial one and strong, specific safeguards must be applied to the processing. 

 

It will be possible for users to whitelist service providers, giving consent to certain types of cookies, from certain websites via users’ browser settings. 

 

Users will be able to permit certain types of cookies from one or more service providers, and change those settings easily in their browser. This should make granting cookie permissions easier and more seamless, alleviating cookie consent fatigue. In addition, end users will be able to genuinely choose whether to accept cookies or any similar identifier. Service providers may make access to a website dependent on consent to the use of cookies for additional purposes instead of using a paywall; however, this will only be allowed if the user is able to access an equivalent offer from the same provider that does not involve consenting to the use of cookies. 

 

Does your company have all of the mandated safeguards in place to ensure compliance with the ePrivacy rules, GDPR, and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.

 


Spanish DPA launched Pacto Digital, a digital pact for data protection

The Spanish DPA launched Pacto Digital, a digital pact for data protection with the support of over 40 organizations. 

 

The Pacto Digital initiative by the AEPD was officially presented to the public on January 28th, Data Protection Day, at a virtual event called “The Forum on Privacy, Innovation and Sustainability”. The event was streamed live, with several state, business and media officials in attendance. The initiative is part of the Spanish DPA’s Social Responsibility and Sustainability Framework and aims to raise awareness, make data protection compatible with innovation, and foster a commitment to privacy among organisations. The principles of the pact promote transparency, giving citizens greater awareness of what data is being collected and why. The initiative also promotes gender and race equality and ensures the protection of children and other vulnerable persons. It promotes and supports innovation by ensuring that technological advancements avoid perpetuating biases, particularly those based on race, origin, belief, religion and gender. 

 

The digital pact initiative launched by the AEPD consists of three documents: a contract, a digital responsibility pledge and a code of conduct. 

 

Organisations which subscribe to this digital pact all sign a contract, showing their commitment to implementing the recommendations of the pact within their organisation. In addition, these organisations commit to giving their employees and users access to the Priority Channel to request the urgent removal of sexual or violent content online, as well as other key tools and resources to help raise awareness of the importance of privacy and personal data. 

 

As part of this initiative, the Spanish DPA has also introduced a Digital Responsibility Pledge containing obligations which subscribing organisations pledge to keep. This is not intended to give them additional responsibilities beyond the legislation to which they are already subject. The pledge is simply tailored to the digital environment and geared towards obtaining a specific commitment from these organisations to uphold the standard. It outlines organisations’ already existing responsibilities specifically geared towards safety and privacy online, and incorporates principles that should be considered to ensure that the ethics of data protection remain intact when designing and implementing new technological developments. 

 

Finally, the code of conduct for good privacy practices is geared towards organisations with their own dissemination channels and the media, both of which the AEPD intends to collaborate with to report on issues of relevance to their networks and audiences. In addition, the code of conduct states that these organisations commit to refraining from identifying victims of the dissemination of sensitive content, or from publishing any information which could possibly identify them, particularly regarding public figures. 

 

Forty organisations signed the pact on January 28th; however, other interested parties may apply online. 

 

On January 28th, the 40 organisations that already form part of this pact made their commitment to the principles of data protection and privacy publicly known by signing the agreement. The digital pact is open to any organisation that wishes to assume the commitments reflected in the contract. Interested organisations may apply online, publicly showing their commitment to the principles outlined in the pact. 

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.


The next update to iOS could significantly impact targeted advertising on free apps.

The next update to iOS has created friction between Apple and advertising giants like Facebook which rely on targeted ads for revenue. 

 

The next update to iOS, initially announced last summer, will force app developers to explicitly seek permission to access the phone’s unique identifier, known as the IDFA. The update, expected in early spring, is likely to significantly reduce the effectiveness of targeted mobile ads. To tailor mobile ads to smartphone users, app developers and other industry players typically access this unique identifier on devices. Once the new rollout takes effect, however, users will begin to see a prompt seeking their permission to grant access to their IDFA, and it is expected that roughly half of users may refuse access via this prompt. 

 

The effectiveness of targeted advertising relies heavily on access to personal identifiers like Apple users’ IDFA. 

 

Targeted advertising relies heavily on access to significant amounts of personal data, both to determine who is most likely to respond to a particular message and to decide how and when to deliver it for maximum impact. For this reason, access to data through Apple users’ IDFA is key to truly effective targeted ads, and this update from Apple will no doubt significantly impact targeted advertising.

 

Facebook argues that these changes will be of dire consequence to small businesses which depend on targeted advertising on free apps like theirs. 

 

One industry leader which generates much of its revenue through advertising has spoken up about the anticipated update. In a recent blog post, Facebook expressed disagreement with Apple’s approach, complaining that Apple provides no context on the benefits of targeted ads and suggesting that Apple’s new prompt implies a trade-off between personalized advertising and privacy. Facebook argues that the two are not mutually exclusive, and that it can and does provide both.

 

Facebook argues that these changes will significantly impact the income of small business owners who rely on targeted ads via free apps to reach the customers most likely to convert into revenue for their businesses. Facebook intends to show Apple’s prompt asking for consent, but also to include its own prompt providing context on the benefits available to users through targeted advertising. 

 

Some industry leaders are opting to give up access to certain data, eliminating the need to seek consent. 

 

Google has also spoken up about the change and how it plans to navigate it. The company plans to stop using any data that falls under Apple’s AppTrackingTransparency framework for its iOS apps, which will exempt it from needing to show the prompt. Google is essentially forgoing access to a significant amount of personal data in order to avoid needing to seek consent. 

 

How do data protection laws and this era of consent affect targeted advertising?

 

The GDPR defines consent as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”. Consent must therefore be unambiguous and given by a statement or a clear affirmative action.

 

Data protection laws like the GDPR and CCPA are designed to empower consumers, giving them more control over their personal information. The GDPR in particular operates on an “opt-in” model of consent, as clarified in its definition of the term: it cannot be assumed that a user has given consent simply because they have not opted out. Users must clearly and unambiguously opt in, and companies cannot assume consent unless the user has been asked, in the right way, and has given a clear affirmative response. From Apple’s perspective, this update falls in line with the GDPR, seeking clear, unambiguous consent from users to share a unique identifier such as their IDFA. “The philosophy behind it is similar to that of cookie consents for websites, only in the world of iOS apps,” comments Dr Bostjan Makarovic, Aphaia’s Managing Partner. There is no doubt, however, that this update will affect the current model of advertising, and not just companies like Facebook which generate much of their income through targeted ads on their free platforms, but also much smaller businesses seeking their target audience through the social network giant.

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.