R (Bridges) v CC South Wales Police

[2020] EWCA Civ 1058, Sir Terence Etherton MR, Dame Victoria Sharp P and Singh LJ

The Court of Appeal allowed the Claimant’s appeal against the decision of the Divisional Court that the Defendant had not breached the PSED in its trial of live automated facial recognition technology (“AFR Locate”), by which the Defendant compared images of members of the public captured by surveillance cameras to images of persons on a watchlist. It was argued on appeal that AFR Locate breached Article 8 ECHR, data protection legislation and the PSED. This note is concerned only with the PSED challenge. The decision is an important one in that it underlines the proactive nature of the PSED and the evidence-gathering obligation it places on public authorities.

AFR Locate operates by (1) processing images of subjects on an existing database by extracting their facial features and expressing them as numerical values; (2) taking digital images of others; (3) detecting and isolating those others’ faces; (4) extracting the unique facial features from each facial image; (5) comparing those unique features with the images on the original database (here the watchlist); and (6) generating “matches” from “similarity scores” which reach a threshold value.
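The six-step pipeline described above can be sketched in outline. This is an illustrative sketch only, not the vendor's actual algorithm: the feature vectors, the use of cosine similarity as the "similarity score", and the 0.9 threshold are all assumptions for demonstration.

```python
# Hedged sketch of an AFR Locate-style matching step: faces are reduced to
# numeric feature vectors, and each probe vector is compared against the
# watchlist; any score reaching the threshold is reported as a "match".
# Cosine similarity and the 0.9 threshold are illustrative assumptions.
import math

def cosine_similarity(a, b):
    """Similarity score between two facial feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def find_matches(probe, watchlist, threshold=0.9):
    """Return (identity, score) pairs whose similarity reaches the threshold."""
    return [(name, score)
            for name, features in watchlist.items()
            if (score := cosine_similarity(probe, features)) >= threshold]

# Hypothetical watchlist of pre-computed feature vectors (step 1)
watchlist = {"suspect_A": [0.9, 0.1, 0.4], "suspect_B": [0.1, 0.8, 0.2]}
probe = [0.88, 0.12, 0.41]  # features extracted from a CCTV frame (steps 2-4)
matches = find_matches(probe, watchlist)  # steps 5-6
```

The sketch makes the legal point concrete: any bias in how the feature-extraction or scoring stages were trained propagates silently into the threshold comparison, which is why the Court regarded scrutiny of the training data as central to the PSED.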

The Defendant had used AFR Locate on about 50 occasions between May 2017 and April 2019 at a variety of large public events at which it had deployed CCTV cameras on police vehicles, or on poles or posts. The use of the technology was publicised to members of the public prior to and during each deployment. If, during any deployment, AFR Locate identified a possible match, the images would be reviewed by an operator who, if satisfied of a match, would notify officers stationed nearby who might stop and search or arrest the person suspected of being on the watchlist.

The Claimant, who was supported by Liberty, argued that the Defendant had failed to comply with the PSED because it had failed to consider the possibility that AFR Locate might produce results that were indirectly discriminatory on grounds of sex and/or race by producing a higher rate of false positive matches for female and/or for BAME faces. The Divisional Court rejected this claim in part because there was no evidence that AFR Locate operated in a discriminatory way. On appeal it was argued that the Defendant had breached the PSED by failing to recognise the risk of indirect discrimination. The Court unanimously agreed. There was evidence that facial recognition software could create a greater risk of false identifications in the case of women and people from BAME backgrounds. There was no evidence, or allegation, that the particular software utilised by AFR Locate had this flaw. But far from having, as the Divisional Court had suggested, an “air of unreality”, the Claimant’s “contention that there has been a breach of the PSED … seems to us to raise a serious issue of public concern, which ought to be considered properly by” the Defendant.

The Court of Appeal referred to the decisions of the Court of Appeal in R (Elias) v SSD [2006] EWCA Civ 1293, [2006] 1 WLR 3213 and in R (Bracking) v SSWP [2013] EWCA Civ 1345 and of the Supreme Court in Hotak v Southwark LBC [2015] UKSC 30, [2016] AC 811, citing the guidance issued by McCombe LJ in Bracking at §26 and emphasising that the fact that the PSED’s nature as a “duty of process and not outcome” did “not, however, diminish its importance. Public law is often concerned with the process by which a decision is taken and not with the substance of that decision”. The Court referred to the Stephen Lawrence Inquiry Report 1999 and to the discussion of the PSED’s evolution in Karon Monaghan’s Equality Law (2nd ed, 2013), stating that “The reason why the PSED is so important is that it requires a public authority to give thought to the potential impact of a new policy which may appear to it to be neutral but which may turn out in fact to have a disproportionate impact on certain sections of the population” and that, although the PSED “does not require the impossible[, i]t requires the taking of reasonable steps to make enquiries about what may not yet be known to a public authority about the potential impact of a proposed decision or policy on people with the relevant characteristics”.

“182. We also acknowledge that, as the Divisional Court found, there was no evidence before it that there is any reason to think that the particular AFR technology used in this case did have any bias on racial or gender grounds. That, however, it seems to us, was to put the cart before the horse. The whole purpose of the positive duty (as opposed to the negative duties in the Equality Act 2010) is to ensure that a public authority does not inadvertently overlook information which it should take into account.”

The Court of Appeal did not accept, as the Divisional Court had done, that the “human failsafe” component in AFR Locate was sufficient to discharge the PSED, which “is a duty as to the process which needs to be followed, not what the substance of the decision should be”, not least because “human beings can also make mistakes”, particularly as regards identification. The manufacturer of the software had provided witness evidence for the purposes of the trial that it was trained on “roughly equal quantities of male and female faces” and on “a wide spectrum of different ethnicities … collected from sources in regions of the world to ensure a comprehensive and representative mix”. But the “precise makeup, scale and sources of the training data used” were “commercially sensitive” and had not been, and could not be, released by the manufacturer, and the Defendant had “never sought to satisfy [itself], either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex” despite evidence that automatic facial recognition software “can sometimes have such a bias”.

Nor did the Court of Appeal accept, as the Divisional Court had done, that the Defendant had satisfied the PSED in the context of a “trial process” by “continu[ing] to review events”: “The PSED does not differ according to whether something is a trial process or not. If anything, it could be said that, before or during the course of a trial, it is all the more important for a public authority to acquire relevant information in order to conform to the PSED and, in particular, to avoid indirect discrimination on racial or gender grounds”:

“201. In all the circumstances, therefore, we have reached the conclusion that SWP have not done all that they reasonably could to fulfil the PSED. We would hope that, as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias”.
