Court of Appeal: Police Force’s use of automated facial-recognition technology unlawful
The Court of Appeal has ruled the South Wales Police Force’s use of automated facial recognition technology was unlawful (R (Bridges) v Chief Constable of South Wales Police and others).
The South Wales Police Force used live automated facial recognition (AFR) technology on around 50 occasions between May 2017 and April 2019, as part of a pilot project at large public events. The technology compared live digital images of members of the public against digital images on a watchlist compiled by the Force. The watchlist included people wanted on warrants and those suspected of crimes.
The AFR technology processed special category data (biometric information).
Mr Bridges brings claim
Mr Bridges, a civil liberties campaigner (not on the watchlist), claimed the use of AFR technology was unlawful. He argued it breached Article 8 of the European Convention on Human Rights (right to respect for private and family life), the Data Protection Act 2018 (DPA 2018) and the public sector equality duty under the Equality Act 2010.
Divisional Court upholds the Force’s AFR use
The Divisional Court rejected Mr Bridges’ claims. It ruled that, while the AFR technology engaged the right to respect for private and family life, the Force had used it within a legal framework and in accordance with the law.
It also ruled the processing lawful under the DPA 2018 and that the Force had not breached the Equality Act 2010.
Mr Bridges appealed to the Court of Appeal.
Court of Appeal partly overturns decision
The Court of Appeal partly upheld the appeal. It ruled the Force’s use of AFR technology was unlawful.
The Court noted there was no clear guidance on where AFR could be used and who could be put on a watchlist. The Force’s discretion was too broad to be “in accordance with the law” under the European Convention on Human Rights.
The Court also ruled the Force’s Data Protection Impact Assessment inadequate. It failed to properly assess the impact of AFR on the rights and freedoms of data subjects. It also failed to address potential measures to mitigate the risks identified.
The Court also decided the Force had not done all it reasonably could to fulfil its public sector equality duty, therefore breaching the Equality Act 2010. The duty ensures that public authorities do not inadvertently overlook the potential discriminatory impact of any new, seemingly neutral, policy. The Force had not investigated whether AFR had an unacceptable bias on grounds of race or gender.
The Force is not appealing the decision. Its Chief Constable said: “The test of our ground-breaking use of this technology by the courts has been a welcome and important step in its development. I am confident this is a judgment that we can work with.”
The ICO has since issued a statement welcoming the judgment as clarifying the requirements for using AFR technology in public places in compliance with data protection legislation.
AFR is controversial, and this judgment highlights the data privacy issues it raises. Any use must be proportionate and lawful. Further challenges are likely as organisations using this technology, such as police forces, adapt to this ruling.
The judgment considered the data protection claims under the DPA 2018, not the DPA 1998. This is welcome: while the specific processing of Mr Bridges’ data took place under the DPA 1998, the Court’s analysis of the DPA 2018 means the judgment remains relevant to future uses of AFR technology.
The DPA 2018 requires controllers, in most scenarios, to have an “appropriate policy document” in place when processing special category data and criminal offence data. Although the processing in this case pre-dated the DPA 2018 (so the requirement was not considered in the judgment), the judgment underlines the need for such a document when processing special category data.
Controllers should note the Court’s criticism of the Force’s Data Protection Impact Assessment. A Data Protection Impact Assessment must carefully assess the potential impact of the proposed processing on individuals, demonstrate that there is no less intrusive way to collect the data, and show that steps to mitigate any risks to individuals’ rights and freedoms were considered.
The articles published on this website, current at the date of publication, are for reference purposes only. They do not constitute legal advice and should not be relied upon as such. Specific legal advice about your own circumstances should always be sought separately before taking any action.