Artificial intelligence: checklist for public bodies in England

Published: 1 September 2022

Last updated: 1 September 2022

What countries does this apply to?

  • England
  • Scotland
  • Wales

This checklist is for public bodies in England (and non-devolved and cross-border public bodies).

It will help public bodies that do not have a specific duty to prepare or publish an equality impact assessment (EIA) to think about what they need to do to comply with the Public Sector Equality Duty (PSED).

It also expands on some core PSED themes and how they apply to using AI.

Checklist

You should develop a checklist to fit your own circumstances, but you may want to take the following steps.

  1. identify if and how you (or others on your behalf) use AI, and consider how the PSED applies
     
  2. collect equality evidence:
  • assess existing available information
  • look at external research
  • talk to staff, service users, equality groups and community groups
  • identify and address any data gaps
  3. review how the AI could affect people with different protected characteristics, either positively or negatively
     
  4. assess the potential and actual impact by looking at the equality evidence and asking:
  • does the proposed or existing AI cause, or could it cause, discrimination?
  • does the proposed or existing AI help to eliminate discrimination?
  • does the proposed or existing AI contribute to advancing equality of opportunity?
  • does the proposed or existing AI affect good relations?
  5. use the results of the equality impact assessment when developing the new AI-related proposal or reviewing existing services (even if the AI was developed outside of your organisation):
  • do you need to make a big change or adjust the proposal or services?
  • can you continue or should you stop and remove the proposal or service?
  6. make sure you consider the results of the assessment carefully when making the final decision about the service or product and how it will be put in place
     
  7. keep records of decisions and how you considered the PSED (for example, minutes of meetings)
     
  8. publish the results of the assessment to support transparency
     
  9. train staff and make sure they understand their responsibilities
     
  10. continue to monitor the actual impact of the AI-related policy or service, reviewing and amending it as necessary

Think about equality from the start

You must build equality into your existing services and decisions about new policies or services. This includes any services that you commission others to deliver on your behalf. It must happen at the start of your decision-making on whether or not to use a new AI system.

Think about the potential equality benefits of using AI (for example, it may help to meet the needs of people from certain protected characteristic groups), the risks it may pose to equality (for example, it may put some groups at a particular disadvantage) and how you can reduce any risks.

You could gather evidence from research and engagement with other organisations that have experience of using similar products. You should also consider involving people from particular equality groups before and after implementation to help you assess the impact of proposed new or revised policies.

Keep a clear record of how you are considering the PSED

Case law on the PSED clarifies that it is good practice to keep a record of how you have considered equality throughout the decision-making process. Many organisations refer to this as an equality impact assessment (EIA).

If you are a public body in England, there is no legal obligation to carry out an EIA, but doing so can help you to show how you have considered the PSED. It is also important to keep a record of how decision-makers have considered the EIA or any other supporting equality information. Keeping these records can also help you to respond to complaints, audits and freedom of information requests.

Consider each protected characteristic

You should consider how the AI you use may affect people with different protected characteristics.

This does not always mean that you should give each protected characteristic the same level of consideration. Some service areas will be more relevant to some protected characteristics than others. You should be able to justify your decision to limit your analysis if you are challenged.

Having limited or no data on certain protected characteristics is not an excuse for not considering them. Where you have data gaps, you should take proportionate steps to fill these. For example, this might include undertaking targeted engagement or reviewing existing research.

It may be important to break your analysis down by more than one protected characteristic. This is often referred to as ‘intersectionality’. For example, if you are using facial recognition technology, is there any evidence that younger Black men are being treated differently to younger White men? You should consider how many people may be affected and the severity of the impact.
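
If you hold suitable data on outcomes and protected characteristics, a breakdown of this kind can be produced with standard analysis tools. The sketch below is illustrative only, assuming hypothetical 'ethnicity', 'age_band' and 'flagged' fields; it is not a prescribed method, and any real analysis would use your own data and categories.

  # Illustrative sketch: break an AI system's adverse-outcome rate down by
  # more than one protected characteristic at once. The column names and
  # sample data below are hypothetical.
  import pandas as pd

  decisions = pd.DataFrame({
      "ethnicity": ["Black", "Black", "White", "White", "Black", "White"],
      "age_band": ["18-24", "18-24", "18-24", "18-24", "45-54", "45-54"],
      "flagged": [1, 0, 0, 0, 0, 1],  # 1 = adverse outcome from the AI
  })

  # Adverse-outcome rate and group size for each combination of ethnicity
  # and age band (an intersectional view).
  rates = (
      decisions.groupby(["ethnicity", "age_band"])["flagged"]
      .agg(adverse_rate="mean", people="count")
      .reset_index()
  )
  print(rates)

Small group sizes make rates unreliable, so differences found this way are a prompt for further investigation rather than a conclusion in themselves.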

Consider the PSED when someone else is developing or providing the artificial intelligence

The PSED applies even if you are:

  • commissioning someone outside of your organisation to develop the AI for you
  • buying an existing product
  • commissioning a third party to use the AI on your behalf

Any external third party delivering or producing a service on your behalf may be subject to the PSED for that service. Where this is the case, it will be important to monitor compliance.

We recommend you make it a contractual requirement for the third party to comply with the PSED and provide the information you might need to monitor how the AI is working once it is implemented. For more information, read our procurement guidance for England.

Keep monitoring the artificial intelligence

The PSED is an ‘ongoing duty’.

This means you need to make sure the AI is working as intended, and guard against unlawful discrimination or negative effects that you might not have anticipated.

As a minimum, this is likely to include regular monitoring and evaluation to see whether people with one or more protected characteristics are being treated more or less well than other people. Sometimes this may be difficult. To support your analysis, you might want to consider:

  • engagement with staff and service users
  • complaints from service users
  • national research
  • court judgments
  • feedback from other organisations using similar technology
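
Alongside these sources, a routine quantitative check can support your monitoring. The sketch below is a minimal illustration, assuming hypothetical 'sex' and 'outcome' fields and a 5 percentage point flag chosen for the example only; it is not a legal threshold or a prescribed method.

  # Illustrative sketch: flag any group whose outcome rate differs markedly
  # from the overall rate, as a prompt for further investigation. The
  # column names, data and 5 percentage point flag are hypothetical.
  import pandas as pd

  monitoring = pd.DataFrame({
      "sex": ["F", "F", "M", "M", "M", "F", "M", "F"],
      "outcome": [1, 0, 1, 1, 0, 0, 1, 1],  # 1 = service granted
  })

  overall_rate = monitoring["outcome"].mean()
  by_group = monitoring.groupby("sex")["outcome"].mean()

  for group, rate in by_group.items():
      gap = abs(rate - overall_rate)
      status = "review" if gap > 0.05 else "ok"
      print(f"{group}: rate={rate:.2f} gap={gap:.2f} -> {status}")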
