Which countries does this apply to?
- England
- Scotland
- Wales
This checklist is for public bodies in England (and non-devolved and cross-border public bodies).
It will help public bodies that do not have a specific duty to prepare or publish an equality impact assessment (EIA) to think about what they need to do to comply with the Public Sector Equality Duty (PSED).
It also expands on some core PSED themes and how they apply to using AI.
Checklist
You should develop a checklist to fit your own circumstances, but you may want to take the following steps.
Think about equality from the start
You must build equality into your existing services and into decisions about new policies or services, including any services that you commission others to deliver on your behalf. This consideration must begin at the start of your decision-making on whether to use a new AI system.
Think about:
- the potential equality benefits of using AI (for example, it may help to meet the needs of people from certain protected characteristic groups)
- the risks it may pose to equality (for example, it may put some groups at a particular disadvantage)
- how you can reduce any risks
You could gather evidence from research and engagement with other organisations that have experience of using similar products. You should also consider involving people from particular equality groups before and after implementation to help you assess the impact of proposed new or revised policies.
Keep a clear record of how you are considering the PSED
Case law on the PSED clarifies that it is good practice to keep a record of how you have considered equality throughout the decision-making process. Many organisations refer to this as an equality impact assessment (EIA).
If you are a public body in England, there is no legal obligation to carry out an EIA, but doing so can help you to show how you have considered the PSED. It is also important to keep a record of how decision-makers have considered the EIA or any other supporting equality information. Keeping these records can also help you to respond to complaints, audits and freedom of information requests.
Consider each protected characteristic
You should consider how the AI you use may affect people with different protected characteristics.
This does not always mean that you should give each protected characteristic the same level of consideration. Some service areas will be more relevant to some protected characteristics than others. You should be able to justify your decision to limit your analysis if you are challenged.
Having limited or no data on certain protected characteristics is not an excuse for not considering them. Where you have data gaps, you should take proportionate steps to fill these. For example, this might include undertaking targeted engagement or reviewing existing research.
It may be important to break your analysis down by more than one protected characteristic. This is often referred to as ‘intersectionality’. For example, if you are using facial recognition technology, is there any evidence that younger Black men are being treated differently to younger White men? You should consider how many people may be affected and the severity of the impact.
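Where you hold suitable monitoring data, this kind of breakdown is straightforward to automate. The sketch below is a minimal illustration in Python using pandas; the column names and records are invented for illustration, and a real analysis would also need meaningful group sizes before drawing any conclusions.

```python
# Minimal, hypothetical sketch of an intersectional breakdown.
# The columns 'ethnicity', 'age_band' and 'positive_outcome' are invented;
# substitute the fields your own monitoring data actually holds.
import pandas as pd

records = pd.DataFrame({
    "ethnicity": ["Black", "Black", "White", "White", "Black", "White"],
    "age_band": ["18-24", "35-44", "18-24", "18-24", "18-24", "35-44"],
    "positive_outcome": [False, True, True, True, False, True],
})

# Outcome rate broken down by a single protected characteristic...
print(records.groupby("ethnicity")["positive_outcome"].mean())

# ...and by the intersection of two characteristics, with group sizes,
# so a disparity affecting (say) younger Black men is not masked by an
# acceptable average across all Black service users.
breakdown = records.groupby(["ethnicity", "age_band"])["positive_outcome"].agg(
    rate="mean", count="size"
)
print(breakdown)
```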
Consider the PSED when someone else is developing or providing the artificial intelligence
The PSED applies even if you are:
- commissioning someone outside of your organisation to develop the AI for you
- buying an existing product
- commissioning a third party to use the AI on your behalf
Any external third party delivering or producing a service on your behalf may be subject to the PSED for that service. Where this is the case, it will be important to monitor compliance.
We recommend you make it a contractual requirement for the third party to comply with the PSED and provide the information you might need to monitor how the AI is working once it is implemented. For more information, read our procurement guidance for England.
Keep monitoring the artificial intelligence
The PSED is an ‘ongoing duty’.
This means you need to make sure the AI is working as intended and guard against unlawful discrimination or negative effects that you did not anticipate.
As a minimum, this is likely to include regular monitoring and evaluation to see whether people who share one or more protected characteristics are being treated more or less well than other people. Sometimes this may be difficult. To support your analysis, you might want to consider the following (a minimal code sketch follows this list):
- engagement with staff and service users
- complaints from service users
- national research
- court judgments
- feedback from other organisations using similar technology
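To make regular monitoring concrete, the hypothetical sketch below flags groups whose rate of positive outcomes falls well below that of the best-performing group. Everything in it is illustrative: the data, the group labels and the 0.8 threshold (a rule of thumb borrowed from some fairness analyses, not a legal test under the PSED) would all need to reflect your own service and judgement.

```python
# Hypothetical monitoring sketch: flag groups whose positive-outcome rate
# falls well below the best-performing group's rate. The 0.8 threshold is
# an illustrative rule of thumb, not a legal test under the PSED.
from collections import defaultdict

# Invented example data: (group, positive_outcome) pairs that would, in
# practice, come from your AI system's decision logs.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

totals = defaultdict(int)
positives = defaultdict(int)
for group, positive in decisions:
    totals[group] += 1
    positives[group] += int(positive)

rates = {group: positives[group] / totals[group] for group in totals}
best_rate = max(rates.values())

for group, rate in sorted(rates.items()):
    flag = "  <- review" if rate < 0.8 * best_rate else ""
    print(f"{group}: {rate:.0%} positive across {totals[group]} decisions{flag}")
```

In practice you would run this kind of check on real decision logs at a fixed cadence, and treat any flag as a prompt for human review rather than a verdict.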