Guidance
The Public Sector Equality Duty and data protection
Published: 12 September 2024
Last updated: 12 September 2024
What countries does this apply to?
- England
- Scotland
- Wales
Introduction
This guidance explains the relationship between the Public Sector Equality Duty (PSED) and data protection law. Data protection law includes:
- the UK General Data Protection Regulation (UK GDPR)
- the Data Protection Act 2018
It provides advice for public authorities in England, Scotland and Wales that are legally required to publish equality information under the specific equality duties.
It will also be helpful for authorities when they collect and use data about people sharing particular protected characteristics. This is sometimes called equality monitoring. It will help them to build an evidence base to support compliance with the PSED and to meet their specific duties.
It supplements a range of guidance we have published about the PSED.
This can be read alongside our:
- technical guidance on the PSED in England
- technical guidance on the PSED in Scotland
- technical guidance on the PSED in Wales
This guidance explores the intersection of data protection obligations and the PSED. Where relevant, it signposts guidance on data protection law provided by the Information Commissioner’s Office (ICO). Nevertheless, to ensure compliance with data protection law, public authorities should also consult the official ICO guidance, including on artificial intelligence (AI).
If you work for a law enforcement or intelligence agency, consult the ICO website to find out about the specific requirements you may be subject to when processing data under Part 3 and Part 4 of the Data Protection Act 2018.
The Public Sector Equality Duty
The PSED consists of a general duty and specific duties.
The general duty is set out in Section 149 of the Equality Act 2010. It applies to public authorities and other organisations when they are carrying out public functions across Britain.
The general duty covers the following protected characteristics:
- age
- disability
- gender reassignment
- pregnancy and maternity
- race
- religion or belief
- sex
- sexual orientation
It also covers marriage and civil partnership with regard to discrimination in the workplace.
In summary, authorities subject to the general duty must, in carrying out their functions, have due regard to the need to:
- eliminate unlawful discrimination, harassment and victimisation and other conduct prohibited by the Equality Act 2010
- advance equality of opportunity between people who share a protected characteristic and those who do not
- foster good relations between people who share a protected characteristic and those who do not
These are often referred to as the three aims or needs of the general duty.
What the general duty requires in relation to information
There is no explicit legal requirement under the general duty to collect and use equality information. However, to have due regard to the aims or needs of the general duty, public authorities must understand how their policies and practices affect those with particular protected characteristics. This is often referred to as assessing the equality impact of policies and decisions.
Collecting and analysing equality information on people with protected characteristics (including information from engagement, where relevant) and outcomes can be an important way for authorities to develop this understanding.
Public authorities should take a proportionate approach. They should always consider whether the same results could be achieved with fewer risks to people’s rights and freedoms, in particular their privacy. They should also collect the minimum data required to achieve their objective, as explained in the ICO guide to the data protection principles, Principle (c): Data minimisation. When data is collected, public authorities should refer to best practice and standards for categorising information (for example, standards for collecting ethnicity data) to ensure that the information is relevant.
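To make the data minimisation point concrete, the sketch below shows what a minimal equality monitoring schema might look like. It is an illustration only: the field names, category lists and the `validate_response` helper are assumptions for this example, not a prescribed standard, and real category lists should follow recognised standards such as harmonised ethnicity classifications.

```python
# Illustrative sketch of data minimisation for equality monitoring.
# Field names and category lists are assumptions for this example only;
# use recognised standards (for example, harmonised ethnicity categories).

MINIMAL_FIELDS = {
    "age_band": ["16-24", "25-49", "50-64", "65+", "Prefer not to say"],
    "ethnicity": ["Asian", "Black", "Mixed", "White", "Other",
                  "Prefer not to say"],
    "disability": ["Yes", "No", "Prefer not to say"],
}

def validate_response(response: dict) -> dict:
    """Keep only the expected fields with expected values; drop everything
    else, so direct identifiers (names, addresses) are never stored."""
    cleaned = {}
    for field, allowed in MINIMAL_FIELDS.items():
        value = response.get(field)
        cleaned[field] = value if value in allowed else "Prefer not to say"
    return cleaned

# A submitted form with an unexpected identifier: the name is discarded.
print(validate_response({"age_band": "25-49", "ethnicity": "White",
                         "disability": "No", "full_name": "A. Person"}))
```

The design point is that a ‘Prefer not to say’ option and the absence of direct identifiers reduce the risks to individuals while still supporting aggregate analysis.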
Data protection
Data protection law is based on seven principles of good information handling as outlined in Article 5 of the UK GDPR (as retained EU law). It also gives people specific rights in relation to their personal information and puts certain obligations on organisations that are responsible for processing it. For more details on this, read the guide to the data protection principles published by the ICO.
Personal data means any information relating to an identified or identifiable person (‘data subject’). An identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identifier (such as a name, an identification number, location data or an online identifier) or one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that person. For more details on this, read the ICO guidance Personal information – what is it?.
Collecting personal information for public sector equality duty purposes
Collecting equality information gives public authorities an understanding of the impact of their policies and practices on people who share particular protected characteristics. However, public authorities must make sure that any personal information they collect is necessary to meet their obligations under the general duty. They should also be clear about how the information will be used.
In addition to equality information collected by the public authority itself, other sources of information may be relevant to understanding the impact of its functions on people with particular protected characteristics. Examples of these include:
- national studies
- sector reports
- reports published by organisations such as the Equality and Human Rights Commission, which offer expert advice and guidance
Read our PSED guidance for further information.
Processing equality information lawfully under equality and data protection law
Data protection law does not prevent public authorities from processing personal data that is directly or indirectly linked to protected characteristics under the Equality Act 2010. In fact, processing information on the protected characteristics of service users and employees is key for public authorities to:
- understand and respond to the needs of their communities and workforce
- uncover and address discrimination, biases and inequalities
It also gives public authorities an understanding of the impact of their policies and practices on people who share particular protected characteristics.
However, public authorities must always ensure that the processing of personal data is lawful, fair and transparent, including when data is used in the context of AI. The processing must comply with the principles and requirements of the UK GDPR, as well as with the PSED general duty, as explained below.
What kind of data may public authorities or their contractors process under the UK GDPR?
Public authorities or the organisations they contract to deliver a public function may process personal data and special category data.
Personal data is ‘any information relating to an identified or identifiable natural person (‘data subject’). An identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.’ For more details, read the ICO’s What is personal data? guidance.
Special category data is a subcategory of personal data that needs more protection because it is sensitive. It includes data that can reveal racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data (where used for identification purposes) and data concerning health, a person’s sex life and sexual orientation. Data concerning a person’s transgender status also falls within this definition. For more details, read the ICO’s Special category data guidance.
All protected characteristics under the Equality Act 2010 are personal data. However, not all protected characteristics are explicitly listed as special category data. In particular:
- age and sex are not listed as special category data
- disability, pregnancy and maternity and gender reassignment may implicitly be special category data insofar as the information is relevant to a person’s health - data concerning health is special category data under the UK GDPR
It is important for public authorities, or the organisations they contract to deliver a public function, to be mindful that they may also be processing ‘proxy data’ that can be associated with special category data and / or protected characteristics under the Equality Act 2010.
Proxy data is data that in practice operates as a proxy for other attributes, such as special category data and protected characteristics under the Equality Act 2010. Some attributes can be used to infer a protected characteristic accurately. For example, someone’s age can be inferred from their date of birth. Other attributes may be used to deduce a protected characteristic with varying degrees of accuracy. For example, postcodes may be used to deduce likely ethnicities.
Examples of proxy data for protected characteristics
| Protected characteristic | Examples of associated proxy data |
| --- | --- |
| Disability (including physical and mental impairments) | Special Educational Need (SEN) status in England; Additional Support for Learning (ASL) status in Scotland; Additional Learning Needs (ALN) status in Wales; in receipt of Disability Living Allowance; in receipt of Employment and Support Allowance |
| Race (including colour, nationality, citizenship, ethnic or national origins) | Name; surname; address; postcode; facial image |
| Religion or belief | Name; surname; address; postcode; donation to a religious organisation; attendance at a particular place of faith; image (religious dress) |
| Sexual orientation | Civil partnership status (although this is now increasingly irrelevant given that heterosexual couples can also get a civil partnership) |
| Pregnancy and maternity | In receipt of Maternity Allowance benefit |
| Age | Date of birth |
| Sex | Name / forenames; title |
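As a purely illustrative sketch of why attributes like those in the table need care, the following fragment shows how a held postcode district could be joined to area-level statistics to infer a likely ethnicity. The `AREA_STATS` figures are invented for the example, not real statistics.

```python
# Hypothetical sketch of postcode acting as a proxy for ethnicity.
# AREA_STATS is invented example data, not real statistics.

AREA_STATS = {
    "AB1": {"White": 0.90, "Asian": 0.05, "Black": 0.03, "Other": 0.02},
    "CD2": {"White": 0.40, "Asian": 0.45, "Black": 0.10, "Other": 0.05},
}

def likely_ethnicity(postcode_district: str) -> str | None:
    """Return the most common ethnicity recorded for a district, if known."""
    stats = AREA_STATS.get(postcode_district)
    return max(stats, key=stats.get) if stats else None

# No one was asked about ethnicity, yet holding a postcode supports a
# probabilistic inference - which is why proxy data needs the same care.
print(likely_ethnicity("CD2"))  # -> "Asian"
```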
How can public authorities ensure compliance with the UK GDPR and the PSED when processing data?
To ensure the processing of personal data is lawful, public authorities need to identify an Article 6 basis for processing under the UK GDPR, for example where the processing is necessary for carrying out a statutory function, or where it is based on an individual’s consent.
To ensure the processing of special category data (or proxies for special category data) is lawful, public authorities will also need to meet one of the specific conditions within Article 9 of the UK GDPR in addition to an Article 6 lawful basis. For more details on the lawful processing and special category data conditions, read the ICO’s What are the conditions for processing? guidance.
To ensure compliance with data protection law, public authorities will also need to identify and mitigate the data protection risks of processing personal data, including special category data. Read the ICO’s guidance on how to do this.
To ensure compliance with the PSED, public authorities should carefully consider the equality impact of processing any personal data where it can be used to identify individuals as having a protected characteristic, whether directly or indirectly (by proxy). They will have to consider how processing such data could help them:
- eliminate discrimination and other unlawful conduct under the Equality Act 2010
- advance or worsen equality of opportunity
- foster good relations or lead to community tensions
Public authorities must consider the potential equality implications (positive or negative) of processing data that relates to any protected characteristics before they make the decision to do so. Once a public authority has decided to process data that relates to one or more protected characteristics, it must also monitor its actual equality impact regularly during implementation.
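One hedged way to picture that ongoing monitoring is a simple comparison of outcome rates across groups, as in the sketch below. The records, group labels and the 0.8 (‘four-fifths’) threshold, a common rule of thumb in adverse impact analysis, are assumptions for illustration, not a test mandated by the PSED.

```python
# Illustrative monitoring sketch: compare how often an outcome occurs
# across groups and flag large disparities for review. The records and
# the 0.8 ("four-fifths") threshold are invented for this example.

from collections import defaultdict

records = [
    {"sex": "F", "selected": True},  {"sex": "F", "selected": False},
    {"sex": "F", "selected": False}, {"sex": "M", "selected": True},
    {"sex": "M", "selected": True},  {"sex": "M", "selected": False},
]

counts = defaultdict(lambda: {"selected": 0, "total": 0})
for record in records:
    group = counts[record["sex"]]
    group["total"] += 1
    group["selected"] += int(record["selected"])

rates = {sex: c["selected"] / c["total"] for sex, c in counts.items()}
highest = max(rates.values())
for sex, rate in rates.items():
    flag = "REVIEW" if rate < 0.8 * highest else "ok"
    print(f"sex={sex}: selection rate {rate:.0%} [{flag}]")
```

A flagged disparity does not by itself establish unlawful discrimination, but it indicates where a closer equality impact review is needed.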
We have produced guidance to help public authorities consider the equality impact of their policies.
- Read the guidance for public authorities in England
- Read the guidance for public authorities in Scotland
- Read the guidance for public authorities in Wales
The responsibility for ensuring that equality impact is thoroughly assessed and monitored lies with public authorities. This is the case even if they contract the processing of data to a third-party organisation. For example, this might be done by commissioning a private contractor that processes data using AI-based technologies. This has been highlighted in the R (Bridges) v Chief Constable of South Wales Police case – see paragraphs 191, 200 and 201.
Data processing and AI
It is essential for public authorities to ensure compliance with both data protection and equality law when they process proxy data that can be associated with special category data and / or protected characteristics under the Equality Act 2010. Read more about this in the ICO’s guidance on AI and Data Protection – What about fairness, bias and discrimination?
Processing proxy data to deduce special category data will, in most cases, be the same as processing special category data. This means all the relevant data protection considerations will apply, including the need to identify a valid Article 9 condition. Data protection principles, such as data minimisation, will need to be complied with whether the personal data is special category data or a proxy for another attribute. Processing such proxy data without careful consideration and regular monitoring of its equality impact can lead to unlawful discrimination in ways that may not be immediately obvious. This can have significant consequences, both for the public authority and for individuals.
An example of this was exposed in the Netherlands, where tax authorities used an AI model and proxy data to identify benefit fraud. This led to unlawful discrimination on the basis of race, nationality and religion (see the case study below).
To avoid this, the ICO advises that organisations should conduct a ‘proxy analysis’ of AI models. This should detect whether any features of the model are proxies for protected characteristics. For example, if a model detects correlations between welfare benefit fraud risk scores and carers’ days off work, it may unjustifiably lead to women being disproportionately targeted for fraud investigation. This is because women tend to have more caring obligations, so carers’ days off work operates as a proxy for sex. Having detected a proxy, organisations should ascertain whether they need to remove or adjust the feature to avoid any false correlations. Read more about ‘proxy analysis’ in the ICO’s guidance on AI and data protection.
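As a rough illustration of where a proxy analysis might start, the sketch below checks how strongly a candidate model feature correlates with a protected characteristic. The data and the 0.5 flag threshold are invented; a real analysis would need the organisation’s own data and an appropriate statistical approach.

```python
# Rough sketch of a starting point for 'proxy analysis': test whether a
# model feature correlates with a protected characteristic. The data and
# the 0.5 flag threshold are invented for illustration.

from statistics import correlation  # available from Python 3.10

# Hypothetical training records: days off for caring duties, and sex.
carer_days_off = [10, 8, 9, 1, 2, 0, 12, 1]
is_female      = [1,  1, 1, 0, 0, 0, 1,  0]  # 1 = female, 0 = male

r = correlation(carer_days_off, is_female)
print(f"correlation(carer_days_off, sex) = {r:.2f}")

if abs(r) > 0.5:
    print("Feature may be acting as a proxy for sex - review before use.")
```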
Further information
Penalties for breaching the Data Protection Act
For serious breaches of data protection law, the Information Commissioner can impose a financial penalty of up to £17.5 million or 4% of an organisation’s total annual worldwide turnover in the preceding financial year, whichever is higher.
More information about data anonymisation
Anonymisation is the process of converting data into a form where the identification of individuals is unlikely to take place.
The ICO has published a code of practice on this: Anonymisation: managing data protection risk. This covers the anonymisation of personal information and the disclosure of data once it has been anonymised. The code includes case studies and examples of anonymisation techniques.
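To illustrate two techniques commonly discussed in anonymisation guidance, the sketch below applies suppression (dropping names) and generalisation (age bands and postcode districts), then runs a simple k-anonymity check. All records and the choice of k = 2 are invented for the example.

```python
# Illustrative sketch: suppression and generalisation, then a simple
# k-anonymity check. All records and the choice of k = 2 are invented.

from collections import Counter

records = [
    {"name": "A. Smith", "age": 34, "postcode": "AB1 2CD", "outcome": "yes"},
    {"name": "B. Jones", "age": 37, "postcode": "AB1 9XY", "outcome": "no"},
    {"name": "C. Brown", "age": 52, "postcode": "CD2 4EF", "outcome": "yes"},
]

def anonymise(record: dict) -> dict:
    decade = record["age"] // 10 * 10
    return {
        # Suppression: the direct identifier (name) is dropped entirely.
        # Generalisation: 10-year age band and postcode district only.
        "age_band": f"{decade}-{decade + 9}",
        "district": record["postcode"].split()[0],
        "outcome": record["outcome"],
    }

anonymised = [anonymise(r) for r in records]

# k-anonymity: each combination of quasi-identifiers should appear at
# least k times, or individuals may still be re-identifiable.
k = 2
groups = Counter((r["age_band"], r["district"]) for r in anonymised)
for combo, count in groups.items():
    if count < k:
        print(f"{combo}: only {count} record(s) - generalise further")
```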
More information on data protection
The ICO has guidance on:
- the general GDPR principles
- the lawful basis for processing data
- special category data
- AI and data protection
- law enforcement processing
More information about equality monitoring in employment
Appendix 2 of our Statutory Code of Practice for Employment provides guidance on equality monitoring in the workplace. The guidance may also be relevant to equality monitoring of service users. The ICO has also produced relevant guidance on data protection considerations when handling employees’ information.
The relationship between data protection law and freedom of information
As well as responding to requests for information, public authorities must publish information proactively. The Freedom of Information Act 2000 and the Freedom of Information (Scotland) Act 2002 require public authorities to have a publication scheme and to publish information covered by the scheme.
The ICO Guide to Freedom of Information explains how the UK Freedom of Information Act 2000 affects data protection. The Scottish Information Commissioner’s website provides the equivalent guidance for Scottish public authorities.
Case study: The Dutch benefit scandal
How using AI-based technologies and proxy data can lead to unlawful discrimination
Context
Over the last decade, the Dutch tax authorities used new predictive AI-based technology to identify benefit fraud. This led to unlawful discrimination on the basis of race, nationality and religion, with serious consequences for both individuals and institutions.
What happened?
As many as 26,000 individuals were wrongly accused of child benefit fraud over a period of six years. Many families were forced to repay tens of thousands of euros, leaving them in debt and poverty. Some also lost their home or job as a consequence.
Individuals affected contacted the media and several investigations were launched. This included an investigation by the Dutch Data Protection Authority (DPA). It concluded that the risk-classification model used by the tax authorities involved improper processes and amounted to unlawful discrimination. This was because the nationality of applicants was used to select who would be investigated, without any further indication that they had committed fraud.
New investigations following the scandal showed that the problems stretched wider than child benefits. Another automated risk profiling technology was used by the Dutch tax administration to identify potential income tax fraud. Proxy data for race and religion, such as ‘surnames that end with -ic’ and ‘financial contributions to a Mosque’, were used as indicators of potential fraud. This was again found to amount to unlawful discrimination.
What were the consequences?
In 2021, the Dutch government resigned over the scandal. The financial costs to compensate the victims of the scandal amount to at least £6.2 billion. The DPA fined the Dutch tax authorities more than £5.1 million for the use of these algorithms. They cited the ‘unlawful, discriminatory and therefore improper manner’ of processing data.
In 2021, the Netherlands Institute for Human Rights conducted its own investigation after receiving multiple complaints from victims. It concluded that there was sufficient evidence to make a presumption of unlawful indirect discrimination on the grounds of race or ethnicity. The institute is now using its powers to give rulings for each of the complainants. The Dutch government must now show that it has not discriminated. The first three rulings have been published, concluding unlawful indirect discrimination in all three cases. They recommend conducting human rights impact assessments before algorithms are used in practice.