The CNIL, France's data protection authority, has published GDPR recommendations regarding chatbots and insights on the implications of their use.
Chatbots are a fairly common feature on websites today, allowing users to have frequently asked questions answered quickly and easily and to receive other useful information in an interactive way. Personal data is typically processed along the way, so data controllers and processors must remain mindful of any issues relating to the rights and freedoms of individuals. Where one has been appointed, a Data Protection Officer can be helpful in this regard, as there are cases where a Data Protection Impact Assessment is recommended or even required.
Chatbots typically require placing cookies, and this must be done in compliance with the regulations.
Chatbots preserve the conversation history across the different pages of a website on which they appear, and to achieve this, cookies are frequently placed on users' devices. This must be done in accordance with data protection law. The CNIL's recommendations address the use of these cookies in light of the French Data Protection Act, in particular Article 82, which governs the use of cookies and other trackers.
Two ways to place cookies.
Because the presence or use of a chatbot involves depositing cookies onto a user's device, permission may be required to do so. Two options are available to the chatbot operator. The first is to obtain the user's prior consent to deposit the cookie; this consent must be free, specific, informed and unambiguous. The second is to place the cookie only when the user activates the chatbot, by clicking a button that specifically triggers its opening. In that case, consent is not required, because the cookies serve the sole purpose of providing the chatbot service the user has requested. However, if the tracker used for the chatbot serves any purpose other than the chatbot itself, user consent would be required. The data collected by this tracker must only be stored for as long as is necessary to achieve the purpose of the processing.
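The two options above can be summarized as a small decision helper. This is an illustrative sketch only; the function and parameter names are hypothetical, not taken from the CNIL recommendations, and real implementations would hook into a consent-management platform.

```typescript
// Sketch of the two lawful routes for placing a chatbot cookie.
// All names here are illustrative assumptions.

type CookieDecision = { allowed: boolean; reason: string };

function mayPlaceChatbotCookie(opts: {
  priorConsentGiven: boolean;          // option 1: free, specific, informed, unambiguous consent
  userActivatedChatbot: boolean;       // option 2: user clicked the button opening the chatbot
  cookieUsedForOtherPurposes: boolean; // e.g. the same tracker also feeds analytics
}): CookieDecision {
  // If the tracker serves any purpose beyond the chatbot itself,
  // only explicit prior consent makes placement lawful.
  if (opts.cookieUsedForOtherPurposes) {
    return opts.priorConsentGiven
      ? { allowed: true, reason: "prior consent covers the additional purposes" }
      : { allowed: false, reason: "consent required for non-chatbot purposes" };
  }
  // A cookie strictly necessary to provide a service the user explicitly
  // requested (clicking to open the chatbot) does not require consent.
  if (opts.userActivatedChatbot) {
    return { allowed: true, reason: "strictly necessary for the requested chatbot service" };
  }
  return opts.priorConsentGiven
    ? { allowed: true, reason: "prior consent obtained" }
    : { allowed: false, reason: "no consent and chatbot not activated by the user" };
}
```

A consent banner, for example, would supply `priorConsentGiven`, while the chatbot's open button would supply `userActivatedChatbot`.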
French DPA recommendations on the collection of special categories of data by a chatbot.
The CNIL advises that special attention should be paid when collecting special categories of data. These may include information relating to health, religious affiliation, political opinions, etc. In some cases the collection of this information is foreseeable and the processing is therefore relevant; for example, a chatbot for a health-related assistance service may collect and process relevant health data. In those cases it is necessary to ensure that the processing complies with Article 9(2) of the GDPR. The processing of special categories of data is one of nine criteria which can make a Data Protection Impact Assessment necessary; where more than one of these criteria is met, a Data Protection Impact Assessment may become mandatory. “This might be the case where minors’ data is involved or where the data gathered by the chatbot is combined, compared or matched with data from other sources”, comments Cristina Contero Almagro, Partner at Aphaia.
In other cases the collection of such sensitive data is not foreseeable: chatbots often offer the option to type freely, and the data controller or processor may not have anticipated a user volunteering sensitive data. In those cases prior consent is not required. However, mechanisms must be put in place to minimize the risks to the rights and freedoms of individuals. This can be done by displaying a message before or when the chatbot is launched, urging people to refrain from communicating special categories of data. In addition, a purge system can be set up, since retaining the sensitive data is not necessary.
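A purge mechanism of the kind described above might look like the following sketch. The message shape, the sensitivity flag, and the 30-day retention period are all assumptions for illustration, not figures from the CNIL guidance.

```typescript
// Illustrative purge sketch: drop chatbot messages once they exceed a
// retention period, and drop any message flagged as containing special
// category data immediately. The 30-day window is an assumed example.

interface StoredMessage {
  text: string;
  sentAt: Date;
  flaggedSensitive: boolean; // e.g. set by a keyword filter or manual review
}

const RETENTION_DAYS = 30;

function purge(messages: StoredMessage[], now: Date = new Date()): StoredMessage[] {
  const cutoff = now.getTime() - RETENTION_DAYS * 24 * 60 * 60 * 1000;
  return messages.filter(
    (m) => !m.flaggedSensitive && m.sentAt.getTime() >= cutoff
  );
}
```

In practice such a purge would run as a scheduled job against the conversation store, keeping retention aligned with the purpose of the processing.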
Conversations with a chatbot may not be used for decision making affecting an individual.
Regardless of the nature of the conversation, an exchange with a chatbot alone, without any human intervention, cannot lead to important decisions for the person concerned, such as the refusal of an online credit application, the application of higher rates, or the inability to submit an application for a position. Conversations with chatbots may, however, form part of a larger process that includes meaningful human involvement.
Article 22 of the GDPR prohibits solely automated decision-making that produces legal effects concerning an individual or similarly significantly affects them. Exceptions include cases where the person has given explicit consent, and where the decision is necessary for a contract between the data subject and the controller. In either case, the data subject must be provided with the means to obtain human intervention, which a chatbot alone cannot provide.