MPs hold first ever debate on live facial recognition
MPs have held an open debate on police use of live facial recognition technology for the first time since it was initially deployed by the Met Police in August 2016
MPs have debated police use of live facial recognition (LFR) for the first time, with a consensus emerging on the need for it to be regulated by a specific law, rather than the patchwork of legislation and official guidance that currently governs police deployments.
Throughout the Westminster Hall debate on 13 November 2024, MPs – including members of both front benches – discussed a range of issues associated with the technology, including the impacts of LFR surveillance on privacy; problems around bias, accuracy and racial discrimination; the lack of a clear legal framework governing its use by police; and how its wider roll-out could further reduce people’s dwindling trust in police.
While there were differences of opinion about the efficacy of LFR as a crime-fighting tool, MPs largely agreed that there are legitimate concerns around its use by police and that the technology needs proper regulation.
The majority of MPs involved in the debate also lamented the fact that there had been no Parliamentary debate about police use of the technology until now.
While there has been limited Parliamentary scrutiny of LFR in the form of written questions and answers over the years, the debate – called by Conservative MP John Whittingdale – marks the first time MPs have openly discussed police use of LFR in the eight years since it was first deployed by the Metropolitan Police at Notting Hill Carnival in August 2016.
Since that initial deployment, there have been repeated calls from Parliament and civil society for new legal frameworks to govern law enforcement’s use of the technology. These calls have come from the Lords Justice and Home Affairs Committee (JHAC), which has conducted three separate inquiries into shoplifting, police algorithms and police facial recognition; two of the UK’s former biometrics commissioners, Paul Wiles and Fraser Sampson; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.
During his time in office before resigning in October 2023, Sampson also highlighted a lack of clarity about the scale and extent of public space surveillance, as well as concerns around the general “culture of retention” in UK policing around biometric data.
However, the Home Office and policing bodies have repeatedly maintained there is already a “comprehensive legal framework” in place, which consists of the Police and Criminal Evidence Act (PACE) 1984; the Data Protection Act 2018; the Protection of Freedoms Act 2012; the Equality Act 2010; the Investigatory Powers Act 2016; the Human Rights Act 1998; and common law powers to prevent and detect crime.
The debate concluded with policing minister Diana Johnson outlining the new Labour government’s position on police LFR use, noting that while the technology has “the potential to be transformational for policing”, there are also “legitimate concerns” around its use, “including misidentification, misuse, the effect on human rights, and individual privacy”.
Further noting that the Met’s use of LFR has so far resulted in 460 arrests this year – including the arrests of more than 45 registered sex offenders for breaching their conditions – Johnson said it was important to be clear that facial recognition is already governed by data protection, equality and human rights law, as well as common law powers and guidance from the College of Policing.
“This government wants to take time to listen and to think carefully about the concerns that have been raised, as well as how we are best able to enable the police to use live facial recognition in a way that secures and maintains public confidence,” she said.
“In considering its present and future use, we must balance privacy concerns against the expectations that we place on the police to keep our streets safe… I am therefore committed to a programme of engagement over the coming months to inform this thinking.”
Johnson added that following on from initial conversations with policing – in which senior officers said the lack of a clear framework was inhibiting their use of the technology – the government would be holding a series of roundtables with regulators and civil society groups before the end of the year to inform its thinking going forward.
Accuracy issues
On the issue of accuracy and bias, shadow home secretary Chris Philp – who when in government pushed for much greater police use of the technology and called for LFR watchlists to be linked to the UK’s passport database – highlighted a recent study by the National Physical Laboratory (NPL) that found “no statistically significant” racial bias with police LFR systems when used in certain settings.
“When this technology was first introduced about seven years ago, there were reports – accurate reports – that there was racial bias in the way the algorithm at the time operated,” said Philp. “The algorithm has been developed a great deal since those days, and it has been tested definitively by the National Physical Laboratory, which is the nation’s premier testing laboratory.”
He noted that the NPL specifically found that when deploying the Neoface V4 facial recognition software provided by Japanese biometrics firm NEC, both the Met and South Wales Police could achieve “equitable” outcomes across gender and ethnicity by setting the “face-match threshold” to 0.6 (with zero being the lowest similarity and one indicating the highest similarity).
However, other MPs contested this, noting that while the accuracy may well have been improved by police using LFR at that particular threshold, there are no rules in place to stop them lowering that threshold at any time.
“There’s no such thing as no misrepresentations, or people who are not wrongly identified, and it’s also very easy for a police service to lower that number because we have no judicial oversight of it,” said Labour MP Dawn Butler, who later added that even set at 0.6, the LFR software was still less accurate than so-called police “super-spotters”, who are specialist officers trained to identify people quickly in crowds.
“There could be a case where a police service is trying to prove this system they bought is value for money, and you can imagine a police officer not getting many hits at 0.6, lowering it to 0.5 so they can get more hits, which, in turn, will mean more people are misidentified.
“There should be regulation around this issue. Taking away somebody’s liberty is one of the most serious things we can do in society, so we need to think very carefully if we’re going to introduce something that accelerates that.”
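The mechanism the MPs are debating can be illustrated with a short sketch. This is a hypothetical example, not NEC’s Neoface V4 implementation: it assumes a system that assigns each passer-by a similarity score between zero and one against a watchlist entry, and flags a match whenever the score meets a configurable threshold. Lowering that threshold, as Butler warns, flags more people, including more false matches.

```python
# Hypothetical sketch of threshold-based face matching (not NEC's Neoface V4).
# Each probe face receives a similarity score in [0, 1] against a watchlist
# entry; a match is declared when the score meets or exceeds the threshold.

def matches(scores, threshold):
    """Return the indices of probe faces flagged as watchlist matches."""
    return [i for i, score in enumerate(scores) if score >= threshold]

# Simulated scores for five passers-by; only the 0.95 is a genuine match.
scores = [0.31, 0.55, 0.62, 0.95, 0.58]

print(matches(scores, 0.6))  # [2, 3] -> one false match alongside the true one
print(matches(scores, 0.5))  # [1, 2, 3, 4] -> a lower threshold flags more people
```

With nothing in the rules fixing the threshold at 0.6, the second call shows how a single configuration change triples the number of innocent people flagged in this toy example.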
Racial bias and trust
Lambeth Labour MP Bell Ribeiro-Addy also argued that even at this threshold there is room for error, especially if images in a police database, or those pulled from publicly available sources online, are mislabelled.
“It’s almost inevitable that images will be mislabelled and innocent people will be subject to needless run-ins with the police… The Metropolitan Police’s own testing of its facial recognition algorithm identified disproportionately higher inaccuracy rates when attempting to identify people of colour and women,” she said, highlighting a 2023 study by civil liberties group Big Brother Watch which found that over 89% of all UK police LFR alerts since the technology was introduced have wrongly identified members of the public as people of interest.
“People of colour are already disproportionately stopped and searched at higher rates, and the use of potentially flawed technology will only serve to increase the rate at which ethnic minorities are stopped, searched and possibly even incorrectly detained.”
She further argued that increasing stops via LFR could dampen trust in the police even more, particularly among ethnic minority communities that are already over-policed and under-served. Allowing police to stop people in the street to collect and check other types of biometric information, such as DNA or fingerprints, would not be accepted, she said: “Why should we look at this intrusive, automated biometric software any differently?”
While police and their LFR suppliers claim that people’s biometric data is deleted instantly if they do not match any images on the watchlist, independent MP Iqbal Mohamed highlighted how Google Chrome’s Incognito mode was supposed to be “very private” until it was discovered the company was still storing users’ data in breach of data protection laws. “Companies telling you things are immediately deleted is not always true,” he said.
Other MPs variously claimed that the further roll-out of LFR would “exacerbate existing inequalities and discrimination”, “cause further division and mistrust of the police” and “undermine several of our fundamental rights”, including rights to privacy, freedom of assembly and expression, and non-discrimination.
Judicial oversight and specific legislation
Conservative MP David Davis highlighted the need for judicial oversight of the technology and specific legislation laying out clear rules for its use, arguing that it should not be left to non-statutory guidelines or police discretion.
Multiple MPs also spoke about the future of LFR, including the potential for mission creep and the possibility of linking the UK’s six million CCTV cameras to facial recognition software, arguing there need to be laws in place before the technology is used even more widely.
“The technology is prone to slippage. Way back when … we introduced automatic number plate recognition [ANPR] to monitor IRA terrorists coming from Liverpool to London. That was its exact purpose, but thereafter it got used for a dozen other things, without any legislative change or any approval by Parliament,” said Davis.
Mohamed also said the lack of specific legislation and judicial oversight created huge room for police misuse and overreach, and further highlighted how the technology could easily be leveraged to undermine civil liberties around the right to protest. “Facial recognition can deter individuals from participating in protests or public gatherings due to the fear of being monitored or identified,” he said.
Even while noting that LFR accuracy had improved greatly since the Met’s initial deployment in 2016 and recognising the operational benefits it can bring police, Philp also emphasised the need for legislative control and judicial oversight of the technology.
“In Croydon, [LFR] has resulted in approximately 200 arrests of people who would not otherwise have been arrested, including for things like Class A drug supply, grievous bodily harm, fraud, domestic burglary… [and] a man who had been wanted for two rapes,” he said. “They’d still be walking free if not for this technology.”
Philp added that while it is “not true to say there is a complete vacuum as far as rules and regulations are concerned … there is merit in clarifying at a national level where these guidelines sit”.
He said that while he would not want to see the UK go as far as the European Union has with its Artificial Intelligence Act (AIA) – which bans remote biometric identification in a range of circumstances – on the basis that doing so would allow criminals to go free, he saw some form of “regulation-making power” as a more sensible approach than new primary legislation.
One MP – Liberal Democrat Bobby Dean – explicitly called for a complete halt on police deployments of the technology, at least until primary legislation is in place to control its use: “I think it’s clear from this room today, there are many, many doubts, so we should probably be thinking about halting the use of this technology until we’ve cleared up those doubts.”
Read more about facial recognition technology
- Police defend facial recognition target selection to Lords: Senior police officers confirm to Lords committee that facial recognition watchlist image selection is based on crime categories attached to people’s photos, rather than a context-specific assessment of the threat presented by a given individual.
- Met Police deploy LFR in Lewisham without community input: The Met’s latest live facial recognition deployment in Catford has raised concerns over the lack of community engagement around the police force’s use of the controversial technology.
- ICO reprimands Essex school for illegal facial recognition use: The Information Commissioner’s Office has reprimanded Chelmer Valley High School in Chelmsford for introducing facial recognition and failing to conduct a legally required data protection impact assessment and obtain the explicit consent of students.
Originally published at ECT News