Essex Police has paused its use of live facial recognition (LFR) technology after identifying potential accuracy and bias risks.

The force’s suspension of its LFR system – provided by Israeli biometrics firm Corsight – was revealed in an audit document published by the Information Commissioner’s Office (ICO), which said Essex Police must work to “reduce the risks” identified before continuing with future deployments.

A list of LFR deployments from Essex Police shows the last time the force used the technology was on 26 August 2025, meaning its deployments had already been paused by the time the ICO carried out its audit that November.

While it is currently unclear what specifically prompted the force to suspend its LFR use, Computer Weekly exclusively reported in May 2025 that Essex Police had failed to properly consider its potentially discriminatory impacts, after a “clearly inadequate” equality impact assessment (EIA) was obtained via Freedom of Information rules by privacy campaign group Big Brother Watch.

Experts criticised the document at the time for being “incoherent”, failing to examine the systemic equalities impacts of the technology, and relying exclusively on tests of entirely different software algorithms – used by other police forces and trained on different populations.

The force was also criticised for “parroting misleading claims” from the supplier about the LFR system’s lack of bias. The National Institute of Standards and Technology – widely recognised as the gold standard for LFR testing, as all of its testing data is publicly shared – holds no information to support the accuracy figures cited by Corsight, or its claim to offer essentially the least-biased algorithm available.

Big Brother Watch alleged at the time that these issues taken together meant the force had likely failed to fulfil its public sector equality duty to consider how its policies and practices could be discriminatory.

Independent testing

Responding to the criticisms, the force said at the time that it was continuing to carry out evaluations, noting that both the National Physical Laboratory (NPL) and Cambridge University had been commissioned to conduct further independent testing of its system.

According to the results of that Cambridge study – published 12 March 2026 – the system was more likely to correctly identify men than women, and was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

Matt Bland, a criminologist involved in the study, said: “If you’re an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you’re black. To me, that warrants further investigation.”

By contrast, the further NPL testing – also published in March 2026 – found black men were most likely to be correctly matched by the system and white men least likely, but noted that the disparity was not statistically significant.

Computer Weekly contacted the force to ask what specifically prompted the LFR suspension decision, including whether it was the study results or previous criticisms of the EIA.

“In line with our commitment to our Public Sector Equality Duty, Essex Police commissioned two independent studies which were completed by academia,” a spokesperson said. “The first of these indicated there was a potential bias in the positive identification rate, while the second suggested there was no statistically relevant bias in the results.

“Based on the fact there was potential bias, the force decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software,” they added. “We then sought further academic assessment.

“As a result of this work, we have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals. We will continue to monitor all results to ensure there is no risk of bias against any one section of the community.”

Responding to news of the suspension, Jake Hurfurt, the head of research and investigations at Big Brother Watch, said: “Police across the country must take note of this fiasco. AI [artificial intelligence] surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.”

Ramping up deployments without debate

While the use of LFR by police – beginning with the Met’s deployment at Notting Hill Carnival in August 2016 – has already ramped up massively in recent years, there has so far been minimal public debate or consultation, with the Home Office claiming for years that a “comprehensive” legal framework is already in place.

However, in December 2025, the Home Office launched a 10-week consultation on the use of LFR by UK police, allowing interested parties and members of the public to share their views on how the controversial technology should be regulated.

The department has said that although a “patchwork” legal framework for police facial recognition exists (including for the increasing use of the retrospective and “operator-initiated” versions of the technology), it does not give police themselves the confidence to “use it at significantly greater scale … nor does it consistently give the public the confidence that it will be used responsibly”.

It added that the current rules governing police LFR use are “complicated and difficult to understand”, and that an ordinary member of the public would be required to read four pieces of legislation, police national guidance documents and a range of detailed legal or data protection documents from individual forces to fully understand the basis for LFR use on their high streets.

Before the consultation had even closed, however, the Home Office announced plans for the massive roll-out of AI and facial-recognition technologies as part of sweeping reforms to the UK’s “broken” policing system.  

Under the proposals – announced in late January 2026, nearly three weeks before the consultation closed – the Home Office will increase the number of LFR vans available to police from 10 to 50; set up a new National Centre for AI in Policing – to be known as Police.AI – to build, test and assure AI models for policing contexts; and invest £115m over three years to help identify, test and scale new AI technologies in policing.

‘Panopticon’ vision

In a recent interview with former prime minister Tony Blair, UK home secretary Shabana Mahmood described her ambition to use technologies such as AI and LFR to achieve Jeremy Bentham’s vision of a “panopticon”, referring to his proposed prison design that would allow a single, unseen guard to silently observe every prisoner at once.

Typically used today as a metaphor for authoritarian control, the underpinning idea of the panopticon is that instilling a perpetual sense of being watched among inmates would make them behave as the authorities wanted.

“When I was in justice, my ultimate vision for that part of the criminal justice system was to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his panopticon,” Mahmood told Blair. “That is that the eyes of the state can be on you at all times.”
