Julian Hayes and Michael Drury, partners in the Privacy Group at BCL Solicitors, analyse the impact of the seminal judgment handed down on 4th September 2019 in a claim for judicial review brought by Mr Edward Bridges in the High Court (sitting at Cardiff). In the first legal challenge of its kind anywhere in the world, the Court ruled that South Wales Police’s use of live facial recognition technology (‘FRT’), which it has trialled since 2017, was lawful even though no FRT-specific law was, or is, in existence.
An insidious threat to privacy or a step-change in police ability to identify and catch known and suspected offenders? Broadly, those are the opposing standpoints framing the debate about the increasing use of facial recognition technology (FRT), which uses algorithms to compare live images of people’s faces with photographs held on a database - or “watch list” - to identify individuals “of interest.” FRT therefore presents twin issues: the collection of data on the “innocent” and the labelling of those “of interest.”
Claimed uses of FRT range from identifying criminals and locating missing children to informing pub and bar staff who is next in line to be served. However, FRT is still in its infancy and studies tend to show it is prone to error. Concern has also been expressed at the perceived lack of an adequate regulatory framework governing its deployment in the UK, and the consequent risk that it might be abused and lead to miscarriages of justice. This concern extends beyond NGOs to the regulators charged with overseeing the technology’s operation.
Last week, in judicial review proceedings brought by former councillor Ed Bridges, represented by campaign group Liberty, the courts of England and Wales gave - for the first time anywhere in the world - a judgment on the laws governing FRT. The judges’ decision is being interpreted by some as a “green light” for further roll-out of the technology.
With interventions in the court proceedings from the Information Commissioner (ICO) and the Surveillance Camera Commissioner, Mr Bridges challenged the use of FRT by South Wales Police in two of several ongoing trials taking place in England and Wales. He argued that the use of FRT by the police unlawfully and disproportionately interfered with his right to a private life under the European Convention on Human Rights, that it breached data protection legislation, and that it contravened the anti-discrimination requirement imposed on public authorities by the Equality Act 2010.
Highlighting the crime-fighting efficacy of the technology during the trials, the judges drew attention to the measures which South Wales Police had taken to alert the public to the experiment. They listed the safeguards which had been put in place, including the automatic deletion of the biometric data of anyone not on the watchlist and the “human” confirmation of “matches” identified by the FRT algorithm before police officers took any action.
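For readers curious how those two safeguards fit together in practice, the logic can be illustrated as a minimal sketch. This is not South Wales Police’s actual system; the entry names, the similarity function and the threshold value are all hypothetical, and real deployments use proprietary face-embedding models.

```python
# Illustrative pipeline: threshold match against a watchlist, automatic
# deletion of non-matches, and human confirmation before any action.
from dataclasses import dataclass

MATCH_THRESHOLD = 0.8  # hypothetical similarity cut-off

@dataclass
class WatchlistEntry:
    name: str
    embedding: list  # pre-computed face template (toy vector here)

def similarity(a, b):
    # Toy cosine similarity, standing in for a real face-matching model.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def process_frame(live_embedding, watchlist, confirm_match):
    """Return a confirmed match's name, or None.

    Safeguard 1: if no candidate clears the threshold, the biometric
    data is deleted at once. Safeguard 2: even a strong algorithmic
    match is advisory until a human operator confirms it.
    """
    best = max(watchlist, key=lambda e: similarity(live_embedding, e.embedding))
    if similarity(live_embedding, best.embedding) < MATCH_THRESHOLD:
        del live_embedding  # automatic deletion: person not on the watchlist
        return None
    # "Human in the loop": no action is taken on the algorithm's say-so alone.
    return best.name if confirm_match(best.name) else None
```

The design point the court emphasised is visible in the last line: a positive algorithmic match produces no outcome at all unless the human reviewer agrees.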
In contrast to similar technology in the US, where estimates suggest up to half of all adults are enrolled on face-recognition databases, the watchlist used by the South Wales Police comprised only those who had escaped from justice, were suspected of offences, were missing or vulnerable, or whose presence at a particular event caused concern. Notably, the watchlist had been created by reference to individuals anticipated to be in the locality, so those hoping for a universal database as a crime-detection panacea will be disappointed. The court believed including a person on a watchlist without adequate justification would most likely be unlawful and could give rise to future legal challenges.
Despite concerns previously expressed about FRT by privacy campaigners, regulators and MPs in a report critical of FRT published in July 2019, the judges in this case found that the police were operating in a proportionate manner under their common law powers. They were acting within a clear legal framework of data protection legislation, which, taken with adherence to relevant codes and policies, meant there was no unlawful interference with Mr Bridges’ human rights, nor was there any breach of the Data Protection Act 2018. Although some studies of FRT have reported higher misidentification rates for women and for ethnic groups with darker skin, there was no evidence that this particular FRT system made such errors, which might otherwise have rendered its use discriminatory.
Given the fact-specific nature of the High Court’s judgment, its wider implications for the use of FRT by law enforcement agencies are unclear. Mr Bridges has already announced his intention to appeal the court’s decision, making firm conclusions about the judgment even harder to draw at this stage. In the aftermath of the ruling, South Wales Police cautiously welcomed the outcome, but the Home Office was quick to laud what it claimed was the technology’s demonstrable ability to tackle crime and identify criminals in an efficient and otherwise impossible way, freeing up resources to protect communities. Given the risks of error identified in trials elsewhere, and the potential for such errors to destabilise community relations with the police, it remains debatable whether FRT yet warrants such praise, certainly unless it is deployed with great care and precision.
Beyond confirming that FRT, whether deployed by private or public organisations, engages data protection legislation with which all users must comply, the judgment has little impact on the increasingly widespread use of FRT by private entities in quasi-public spaces such as shops and retail parks. Such use has given rise to much media debate and thrown up significant legal and ethical issues. The European Commission has recently announced plans to regulate FRT as a discrete activity. Were that to happen, the expectation would be that, despite the UK’s imminent departure from the EU, the UK government would adopt the EU’s proposals to ensure continued UK-EU regulatory alignment in the data protection field.
The ICO welcomed the court’s decision in the Bridges case but warned of the risk to public confidence if the technology were used without the necessary privacy safeguards. The regulator indicated it would publish guidance on police deployment of FRT in future, and it is hoped that such guidance will also clarify privacy obligations where co-operation takes place between law enforcement authorities and private FRT operators.
Despite its current limitations, FRT cannot now be “uninvented” and its accuracy is bound to continue to improve. With this in mind, the task for regulators, the courts and legislators will be to provide a clear and up-to-date legal framework, accessible both to FRT operators and the public, to ensure that this next-generation technology is used securely and within the boundaries of what is regarded as acceptable by society as a whole.
We anticipate that, in the wider debate, the bulk of the general public may feel that a further loss of freedom is a worthwhile trade-off for what it perceives as enhanced protection, leaving regulators and NGOs as the principal opponents of the ever-widening use of FRT.
Michael Drury is a partner at BCL Solicitors LLP and an expert on surveillance and investigatory powers as well as information law and cybercrime
Julian Hayes is a partner at BCL Solicitors LLP and a specialist in corporate and financial crime, computer misuse offences, surveillance and data protection law
Are you a recognised expert in your field? At Euronews, we believe all views matter. Contact us at email@example.com to send pitches or submissions and be part of the conversation