
Algorithmic transparency standard published by UK government

Algorithmic transparency standard for government departments and public sector bodies published by the UK government.


A new standard for algorithmic transparency has been published by the UK government, giving guidance to government departments and public sector bodies as part of the National Data Strategy and National AI Strategy. The initiative was launched by the Central Digital and Data Office (CDDO), strengthening the UK’s position as a global leader in trustworthy AI. The CDDO worked closely with the Centre for Data Ethics and Innovation (CDEI) in designing the standard. In its review into bias in algorithmic decision-making, the CDEI recommended that the UK government place a mandatory transparency obligation on public sector organisations that use algorithms to support significant decisions affecting individuals.

There has been a call for more transparency around the use of AI systems both within the UK and internationally.


In addition to the CDEI’s call for a mandated transparency obligation, civil society organisations such as The Alan Turing Institute and the Ada Lovelace Institute, as well as international organisations such as the Open Government Partnership and the Organisation for Economic Co-operation and Development (OECD), have strongly supported the call for more transparency. The OECD Principles on AI state that there should be transparency and responsible disclosure regarding AI systems to ensure that people understand and can challenge AI-based outcomes.

Lord Agnew, Minister of State at the Cabinet Office, said “Algorithms can be harnessed by public sector organisations to help them make fairer decisions, improve the efficiency of public services and lower the cost associated with delivery. However, they must be used in decision-making processes in a way that manages risks, upholds the highest standards of transparency and accountability, and builds clear evidence of impact.”


Several key stakeholders were consulted on developing a standard that will help teams be more meaningfully transparent.


In some cases, algorithmic tools are being used to make decisions that may have legal or economic repercussions for individuals. Meaningful transparency about how these tools support those decisions is therefore essential, particularly if individuals are to be able to understand and challenge them. In addition to the CDDO and CDEI, several experts from civil society and academia, as well as members of the public, were consulted on the development of this standard. Moving forward, the standard will be piloted by various government departments and public sector bodies. At the end of this piloting phase, the CDDO will review the standard based on the feedback received, and formal endorsement will then be sought from the Data Standards Authority during 2022.

This pioneering standard is one of several commitments made in the UK’s National AI Strategy.


The UK recently outlined its National AI Strategy and National Data Strategy, aimed at strengthening the country’s position as a world leader in AI governance. This standard is one of several commitments laid out in the strategy, and makes the UK one of the first countries in the world to develop a national algorithmic transparency standard. France and the Netherlands have both made some progress on developing their own national algorithmic transparency measures, while cities such as Helsinki and New York have begun experimenting with ways to increase algorithmic transparency. In addition to this standard, the UK’s AI Strategy, which we recently covered on our blog, includes several other initiatives to be implemented over time.

Do you use AI in your organisation and need help ensuring compliance with AI regulations? We can help you. Aphaia provides AI Ethics Assessments as well as GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.
