Top 10 police technology stories of 2024

Here are Computer Weekly’s top 10 police technology stories of 2024

Throughout 2024, UK government and law enforcement rhetoric around technology largely focused on the role of automation in reducing cost while boosting efficiency and productivity. Since the ascent of the new Labour government in July, there has also been a renewed focus on law and order given its manifesto commitment to “take back our streets”.

This has translated to expanding the role of various technologies throughout policing, particularly facial-recognition and cloud-based artificial intelligence (AI) tools. However, many of these deployments – as with previous years – are plagued by data protection issues and ethical concerns.

Computer Weekly’s coverage also considered how the government’s new data reforms could further reduce transparency and oversight around police technology, and challenged the assumption that people in the UK are policed by consent, given how little say they have over the technologies being deployed with taxpayer money in public spaces.

1. Spring Budget risks funding legally questionable police tech

In March, then-chancellor Jeremy Hunt committed £230m to police forces to pilot or roll out a host of productivity-boosting technologies, including live facial recognition (LFR), automation and AI, and the use of drones as potential first responders.

Pre-briefings of the government’s technology plans to journalists revealed that automated redaction technologies would be a priority, so that personal information can be removed from documents or irrelevant faces can be blurred out from body-worn video footage.
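At its core, automated redaction pairs a detector with an image filter: find the regions containing personal information, then obscure them. As a rough illustration, the short Python sketch below blurs detected faces in a video frame using OpenCV's bundled face detector; the article does not name the tools forces will actually use, so this is an assumption for illustration only.

# Illustrative sketch of face redaction in a video frame. No specific
# police tool is named in the article; OpenCV's bundled Haar-cascade
# detector is used here purely as an example.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def redact_frame(frame):
    """Blur every detected face in a single BGR video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        region = frame[y:y + h, x:x + w]
        # A heavy Gaussian blur renders the face unrecognisable
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)
    return frame

Deciding which faces are "irrelevant" and should be blurred is, of course, a policy judgement that no detector can make on its own.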

Hunt also committed to providing a further £75m to the roll-out of Violence Reduction Units and hot spot policing tactics, the latter of which largely revolves around the use of data to target police resources and activities to areas where crime is most concentrated.
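Hot spot analysis itself reduces to a simple aggregation: bin geocoded incident reports into grid cells and rank the cells by count. The sketch below assumes a hypothetical list of latitude/longitude pairs; the article does not describe any particular force's method.

# Illustrative hot spot identification: count incidents per grid cell
# and return the most crime-dense cells. The data layout and cell size
# are assumptions, not a real police system's parameters.
from collections import Counter

CELL = 0.005  # grid resolution in degrees (roughly 500m)

def hot_spots(incidents, top_n=10):
    """incidents: iterable of (latitude, longitude) pairs."""
    counts = Counter(
        (int(lat // CELL), int(lon // CELL)) for lat, lon in incidents
    )
    return counts.most_common(top_n)  # candidate areas for targeted patrols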

However, lingering concerns around the legality of how UK police are deploying cloud infrastructure and AI-powered facial recognition could undermine the effectiveness of the investment. In the case of facial recognition, there have been repeated calls for new biometric-focused legislation from a wide range of actors, due to a lack of clear rules controlling its use.

Given the focus on cloud migrations, as well as the computing power and storage required to effectively use AI, data protection experts told Computer Weekly that many of the new AI tools being deployed will be hosted on US-based hyperscale cloud infrastructure, opening them up to potential legal compliance challenges as well.

2. Microsoft admits no guarantee of sovereignty for UK policing data

In June, Computer Weekly reported on documents that showed Microsoft admitted to Scottish policing bodies that it cannot guarantee the sovereignty of UK policing data hosted on its hyperscale public cloud infrastructure.

Released under freedom of information (FoI) rules, the documents were related to the deployment of Police Scotland’s Digital Evidence Sharing Capability (DESC), which Computer Weekly first reported was rolled out with major data protection issues in April 2023.

The disclosure revealed that data hosted in Microsoft’s hyperscale public cloud infrastructure is regularly transferred and processed overseas; that the data processing agreement in place for the DESC did not cover UK-specific data protection requirements; and that while the company may have the ability to make technical changes to ensure data protection compliance, it is only making these changes for DESC partners and not other policing bodies because “no one else had asked”.

The correspondence also contains acknowledgements from Microsoft that international data transfers are inherent to its public cloud architecture. As a result, the issues identified with the Scottish deployment will equally apply to all UK government users, many of whom face similar regulatory limitations on the offshoring of data.

The same set of disclosures also revealed that Police Scotland chose not to formally consult with the data regulator about the risks identified with the system, while the Information Commissioner’s Office (ICO) itself did not follow up on the absence of a formal consultation until nearly three months after live deployment, despite being fully aware of the risks.

The disclosures also revealed the ICO’s advice to Police Scotland on how to make the cloud deployment lawful. While it highlighted some potential transfer mechanisms the regulator believes can ensure compliance, it was explicit that the guidance “does not constitute approval for the roll-out or assurance of compliance”.

3. Met Police to scrap and replace ‘racist’ Gangs Violence Matrix

In February, the Met Police announced it was scrapping its controversial Gangs Violence Matrix (GVM) database after long-standing concerns over the tool’s racial disproportionality.

Set up in 2012 as part of the government’s self-declared and evidence-free “war on gangs” in the wake of the Tottenham riots, the secretive GVM was used by the Met to identify, monitor and target individuals the force considered to be involved in gang violence.

Two separate investigations from 2018, by the ICO and Amnesty International, found that the GVM disproportionately affected people from minority ethnic communities: 78% of those listed in the database at the time were black, compared with 27% of people convicted of serious youth violence-related offences.

They also found that 40% of those listed on the matrix had a “harm score” of zero applied by the algorithm, meaning police had no record of them being involved in a violent offence, while 64% of all individuals had been labelled green (the lowest risk category). Some 75% of all people named on the matrix were found to be victims of violence themselves.

However, human rights groups warned that its replacement is likely to repeat the same mistakes, and criticised the Met for how long it took the force to stop using the system.

4. Starmer announces tech-enabled crackdown on people smuggling

In November, UK prime minister Keir Starmer committed an extra £75m to the recently established Border Security Command (BSC) to fund its acquisition and use of “state-of-the-art surveillance equipment”, as part of a wider clampdown on the “national security threat” of people smuggling gangs.

The new investment in border security builds on £75m the UK government previously committed to the BSC in September 2024, which focused on unlocking “sophisticated new technology and extra capabilities”, such as covert cameras, monitoring technologies, new intelligence units, and improving intelligence and information flows between law enforcement bodies.

This means the overall investment into the BSC – which was set up in July 2024 to coordinate the work of the National Crime Agency (NCA), intelligence agencies, police forces, Immigration Enforcement and Border Force – will be £150m over the next two years.

However, some charities have criticised the government’s focus on enforcement, noting it could lead to desperate people taking more dangerous and deadly journeys. They suggested that, instead, the government should focus on creating safe and legal routes for refugees to enter the UK, which are currently extremely limited.

5. MPs hold first ever debate on live facial recognition

MPs held an open debate on police use of live facial-recognition technology in November for the first time since it was initially deployed by the Met Police in August 2016, with a consensus emerging on the need for it to be regulated by a specific law, rather than the patchwork of legislation and official guidance that currently governs police deployments.

The MPs – including members of both front benches – discussed a range of issues associated with the technology, including the impacts of LFR surveillance on privacy; problems around bias, accuracy and racial discrimination; the lack of a clear legal framework governing its use by police; and how its wider roll-out could further reduce people’s dwindling trust in police.

The majority of MPs involved also lamented that there had been no parliamentary debate on police use of the technology until now.

There have been repeated calls from Parliament and civil society in recent years for new legal frameworks to govern law enforcement’s use of the technology. These include three separate inquiries by the Lords Justice and Home Affairs Committee (JHAC) into shoplifting, police algorithms and police facial recognition; calls from two of the UK’s former biometrics commissioners, Paul Wiles and Fraser Sampson; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

During his time in office before resigning in October 2023, Sampson also highlighted a lack of clarity about the scale and extent of public space surveillance, as well as concerns around the general “culture of retention” in UK policing around biometric data.

However, the Home Office and policing bodies have repeatedly maintained there is already a “comprehensive legal framework” in place.

6. Police cloud project raises data protection concerns despite legal reforms

Nine police forces are seeking to replace their common records management system (RMS) with a cloud-based alternative – but despite upcoming changes to the UK’s data laws, experts told Computer Weekly that the strong likelihood of a US-based hyperscaler winning the contract presents continued risks.

Under the UK’s current data regime, moving sensitive police records to one of the US cloud giants introduces major data protection issues. However, the government’s recently proposed data reforms – which would most likely eliminate many of these risks by allowing routine transfers to hyperscalers – could jeopardise the UK’s ability to retain its law enforcement data adequacy with the EU, while issues around data sovereignty would still persist.

To avoid falling into the same situation with the new cloud-based RMS, the experts made a number of suggestions about the steps the forces should be taking now as data controllers, before the procurement progresses further.

While the government’s new Data Use and Access Bill (DUAB) is set to change the legal rules around law enforcement processing in a way that would unequivocally allow routine data transfers to hyperscalers, the experts say doing so could still risk the UK’s ability to retain its law enforcement adequacy with the European Union (EU) when it comes up for renewal in June 2025.

7. Met Police challenged on claim LFR supported by ‘majority of Lewisham residents’

The Metropolitan Police claimed that its LFR deployments in Lewisham are supported by the majority of residents and local councillors, but a community impact assessment (CIA) later obtained by Computer Weekly shows there has been minimal direct consultation with residents, while elected officials continue to express concern.

In August 2024, Lewisham councillors complained there had been no engagement with the local community ahead of the controversial technology being deployed in the area, with the Met announcing in a tweet that the tech would be used just a month after being urged by councillors to improve its community engagement around LFR.

Responding to Computer Weekly’s questions about the concerns raised by Lewisham councillors, a Met Police spokesperson said at the time that its LFR deployments “have been very much supported by the majority of Lewisham residents, business owners and political representatives – namely Lewisham councillors”.

However, according to the CIA obtained under FoI rules by Computer Weekly, the only mention of “residents” in the entire document is when detailing the press response given to Computer Weekly.

Both elected officials and campaigners criticised the Met’s community engagement approach, saying the force was pushing ahead with deploying LFR despite opposition; asking leading questions during the minimal consultation carried out; and using the CIA process as a rubber stamp.

8. Metropolitan Police officer dismissed for unlawfully accessing Sarah Everard files

A Metropolitan Police officer was dismissed after repeatedly accessing sensitive files related to the disappearance and murder of Sarah Everard while off-duty, prompting concern that legal requirements around police data access – which are due to be removed by the government’s data reforms – are not being followed.

The Met said a total of 104 officers and staff (68 officers and 36 staff) were initially identified as potentially accessing files relating to the investigation without a legitimate policing purpose, which resulted in seven officers being served with gross misconduct notices and appearing in front of a hearing. Ultimately, two thirds of the staff and officers involved had action taken against them.

Campaigners and privacy experts said such situations would become more likely if the government’s Data Use and Access Bill (DUAB) passes, as it is set to remove the police logging procedure that requires forces to keep records detailing how information is accessed and used.

This includes recording a justification for why an individual officer has accessed a particular piece of information, although according to the DUAB’s explanatory notes, officers and staff will still be legally expected to log the time, date and, “as far as possible”, their identity when accessing information.
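In code terms, the current requirement amounts to writing an audit record for every access, and the justification field is precisely what the DUAB is set to drop. A minimal sketch follows; the schema is hypothetical and not drawn from any real police system.

# Minimal sketch of the access logging the current rules require: who
# accessed what, when, and crucially why. The schema is hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AccessLogEntry:
    officer_id: str      # identity of the person accessing the record
    record_id: str       # which piece of information was accessed
    timestamp: datetime  # time and date of access
    justification: str   # the field the DUAB would no longer require

def log_access(officer_id: str, record_id: str, justification: str) -> AccessLogEntry:
    """Create an immutable audit entry for a single data access event."""
    return AccessLogEntry(
        officer_id=officer_id,
        record_id=record_id,
        timestamp=datetime.now(timezone.utc),
        justification=justification,
    )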

Liberal Democrat peer Lord Clement-Jones had previously told Computer Weekly that the removal of police logging requirements was “egregious”, and that it represents a potential divergence from the European Union’s Law Enforcement Directive (LED) that could prevent the UK from renewing its LED data adequacy decision.

9. Automated police tech contributes to UK structural racism problem

In their joint submission to the United Nations’ (UN) Committee on the Elimination of Racial Discrimination – which independently monitors states’ efforts to eradicate racism and promote human rights – the Runnymede Trust and Amnesty International said the use of AI and facial-recognition technologies in policing is contributing to a “worrying rowback” in the civil and political rights of people of colour in the UK.

The submission noted, for example, that despite LFR’s propensity to misidentify people of colour, the Home Office has previously affirmed police forces’ right to use it under existing legal frameworks, and that use of the tech is generally ramping up.

It noted that the use of automated systems such as predictive policing and Automated Number Plate Recognition (ANPR) by police can result in human rights violations and fatalities.

The submission further highlighted the discriminatory outcomes of the Met’s Gangs Matrix database, which resulted in people of colour (predominantly young Black boys and men, in this case) being racially profiled for the music they listen to, their behaviour on social media, or who their friends are.

In their recommendations on police AI, the civil society groups said the UK government should prohibit all forms of predictive and profiling systems in law enforcement and criminal justice (including systems that focus on and target individuals, groups, and locations or areas); provide public transparency and oversight when police or migration and national security agencies use “high-risk” AI; and impose legal limits prohibiting uses of AI that present an unacceptable risk to human rights.

10. Campaigners criticise Starmer post-riot public surveillance plans

Prime minister Keir Starmer announced that the government will establish a “national capability” to deal with violent disorder in the wake of racist rioting across England in August, but campaigners and civil society groups said they are concerned about the surveillance implications of the initiative and its damaging effect on wider civil liberties.

The government said the National Violent Disorder Programme will bring together the best policing capabilities from across the UK to share intelligence on the activity of violent groups, so that authorities can swiftly intervene and make arrests.

The programme announcement follows the outbreak of violent far-right riots in more than a dozen English towns and cities, which specifically targeted mosques, hotels housing asylum seekers, immigration centres, and random people of colour.

Despite acknowledging the clearly far-right nature of the current disorder during a press conference announcing the programme, Starmer also said that the new initiative would be used to identify agitators from all parts of the ideological spectrum.

According to the Network for Police Monitoring (Netpol) – which monitors and resists policing that is excessive, discriminatory or threatens civil liberties – the programme will clearly be used to target organised anti-fascists who went out to oppose the far right across the country on Wednesday 7 August.

“The prime minister actively supports the police’s dismissal of objections to the intrusion, unreliability and discriminatory nature of facial recognition, and the infringement on civil liberties that the extensive filming of demonstrations entails. We may also see more public order legislation and a greater willingness to quickly deploy riot units onto the streets,” it said.

“After the immediate crisis recedes, expanded police surveillance is just as likely to focus on movements for social, racial and climate justice as it does on the far right.”
