ICO guidance for companies that develop or use generative AI systems

The ICO has issued guidance for companies that develop or use generative artificial intelligence (AI) systems to ensure that they are complying with data protection laws.

 

The UK’s data protection regulator, the ICO, has issued guidance for companies that develop or use generative artificial intelligence (AI) systems. Generative AI is a type of machine learning that creates new, unique content or designs. The guidance consists of a series of questions that developers and users should ask themselves before implementing or using generative AI, and it provides a helpful framework for complying with data protection laws. By working through these questions, companies can ensure that they safeguard user data and promote transparency, accuracy, and accountability in the development and use of generative AI systems.

 

Companies using generative AI should assess the associated risks and develop a plan for handling them.

 

To properly develop and use generative AI, companies need to consider several data protection factors. They must first identify a lawful basis for processing personal data, such as consent or legitimate interests. As data controllers, companies using personal data have specific obligations to fulfil. They must also assess and mitigate data-handling risks through a Data Protection Impact Assessment (DPIA), and ensure transparency by making information about their data processing publicly accessible.
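The steps above can be tracked internally before deployment. The sketch below is a hypothetical illustration of such a pre-deployment review; the class and field names are ours, not terms from the ICO guidance, and a real review would cover far more ground.

```python
from dataclasses import dataclass
from typing import Optional


# Hypothetical pre-deployment data protection review; field names are
# illustrative and do not come from the ICO guidance itself.
@dataclass
class DataProtectionReview:
    lawful_basis: Optional[str] = None       # e.g. "consent" or "legitimate interests"
    dpia_completed: bool = False             # Data Protection Impact Assessment done?
    processing_info_published: bool = False  # is information about processing public?

    def outstanding_items(self) -> list:
        """Return the data protection steps still to be completed."""
        items = []
        if self.lawful_basis is None:
            items.append("identify a lawful basis for processing personal data")
        if not self.dpia_completed:
            items.append("complete a Data Protection Impact Assessment")
        if not self.processing_info_published:
            items.append("publish information about data processing")
        return items


review = DataProtectionReview(lawful_basis="legitimate interests")
print(review.outstanding_items())
```

Running the example lists the two remaining steps (the DPIA and the transparency information), since only a lawful basis has been identified.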

 

Under the ICO guidance, any company using generative AI must take steps to protect personal data and the rights of individuals.

To mitigate security risks, developers must protect personal data from threats such as data leakage, model inversion, membership inference, and data poisoning. Collecting only the data required for the software’s intended purpose is essential to limit unnecessary processing. Companies must also put processes in place to comply with individuals’ rights requests, including granting access to any personal data that has been processed, rectifying or deleting that data, and supporting data portability. Finally, companies must consider whether they will use generative AI to make automated decisions, which may require additional safeguards to ensure regulatory compliance.
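The rights requests mentioned above (access, rectification, erasure, portability) typically need a defined handling route in any system processing personal data. The following is a minimal sketch of how such routing might look; the request types mirror the rights listed in the text, but the function and messages are purely illustrative.

```python
from enum import Enum


# Request types mirror the individual rights discussed above;
# the handler itself is a hypothetical illustration.
class RightsRequest(Enum):
    ACCESS = "access"                # provide a copy of processed personal data
    RECTIFICATION = "rectification"  # correct inaccurate personal data
    ERASURE = "erasure"              # delete personal data
    PORTABILITY = "portability"      # export data in a machine-readable format


def handle_request(kind: RightsRequest, user_id: str) -> str:
    # In a real system each branch would act on the underlying data store;
    # here we only return a description of the action taken.
    actions = {
        RightsRequest.ACCESS: f"compiled personal data held for {user_id}",
        RightsRequest.RECTIFICATION: f"corrected records for {user_id}",
        RightsRequest.ERASURE: f"deleted personal data for {user_id}",
        RightsRequest.PORTABILITY: f"exported data for {user_id} as JSON",
    }
    return actions[kind]


print(handle_request(RightsRequest.ERASURE, "user-123"))
```

The point of enumerating the request types is that each right has a distinct obligation attached, so a compliant system needs an explicit, auditable path for every one of them rather than an ad hoc response.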

 

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. We provide Data Protection Officer outsourcing, GDPR and Data Protection Act 2018 consultancy services, and telecoms regulatory consultancy, and we can help your company get on track towards full compliance. Contact us today.
