UK News

Essex Police Pause Facial Recognition, But Why Was It Rolled Out at All?


Essex Police has paused its use of live facial recognition after a Cambridge study found the system was statistically more likely to misidentify black people than people from other ethnic groups, and more likely to misidentify men than women. The force also touted worrying figures: around 1.3 million faces were scanned between August 2024 and February 2025, producing 48 arrests and “only one mistaken intervention.” That is being presented as reassurance. It should be read as evidence of how quickly mass biometric monitoring is being normalised in Britain. Millions of people were scanned in public before the public had any clear answer to a basic question: what evidence was ever produced to justify deploying this in the first place?

There's No Evidence Facial Recognition Reduces Crime, So What's It For?

Police Admit Facial Recognition Does Not Reduce Crime

Essex’s own commissioned report also found “there was no statistically significant evidence that LFR (live facial recognition) deployments reduced crime” and that “crime levels were similar before, during and after deployments”. The report said the main impact appeared to be the identification of specific individuals, rather than any measurable deterrent effect on offending in the surrounding area.

That is awkward for a technology so often sold as a public-safety necessity. If the broader deterrent effect cannot be demonstrated, then the justification narrows considerably. The public is left with a system that scans enormous numbers of people, produces a modest number of interventions, and has not shown clear evidence of making the wider area safer. That is a much thinner case than ministers and police press releases usually imply.

It’s Not Just Essex: Facial Recognition is Rolling Out Nationwide

By the end of 2025, thirteen police forces in England and Wales were using live facial recognition, according to Sky News. In January, the government said the number of facial-recognition vans would increase from 10 to 50. The Home Office has also cited more than 1,300 arrests in London between January 2024 and September 2025 linked to the technology. These figures are often presented as evidence of success, but they are also evidence of quiet expansion.

This is why the Essex pause should not be mistaken for restraint. National policy is moving in the opposite direction. The state is not stepping back from facial recognition; it is embedding it more deeply. Once that infrastructure becomes ordinary, opposition is pushed onto narrower ground: not whether the public should be scanned, but only whether the scanning is sufficiently balanced, proportionate, and accurate. That is a major shift in what citizens are expected to tolerate.

Accuracy Figures Are Troubling, Not Reassuring

For anyone already uneasy about facial recognition, the phrase “only one mistaken intervention” does not sound reassuring. It suggests a model of surveillance that is smooth enough to sustain itself politically. Essex’s own figures show around 1.3 million faces were scanned over 41 deployments, leading to 123 interventions and 48 arrests. That is one arrest for roughly every 27,000 faces scanned. The ratio tells its own story: a large public is subjected to biometric scrutiny so that a small number of targets can be located.
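The arithmetic behind that ratio is easy to reproduce. A quick sketch (the 1.3 million figure is approximate, so treat the results as order-of-magnitude only):

```python
# Back-of-envelope check of the Essex deployment figures cited above.
faces_scanned = 1_300_000  # approximate faces scanned, Aug 2024 - Feb 2025
deployments = 41
interventions = 123
arrests = 48

# One arrest per ~27,000 faces scanned, as stated in the text.
print(f"Faces scanned per arrest:       {faces_scanned / arrests:,.0f}")
print(f"Faces scanned per intervention: {faces_scanned / interventions:,.0f}")
print(f"Average faces per deployment:   {faces_scanned / deployments:,.0f}")
```

On these figures, fewer than 0.01% of scanned faces led to any intervention at all.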

There is also a wider record here. The US National Institute of Standards and Technology (NIST) has previously found that facial-recognition algorithms can show substantial demographic differentials, with false-positive rates in some contexts varying by factors of 10 to more than 100 across groups. It reported higher false-positive rates for women, African Americans, and particularly African American women in one-to-many search systems, precisely the sort of setting that raises the risk of false accusation or added surveillance. Essex’s findings do not emerge in a vacuum. They sit within a longer history of uneven performance and civil-liberties concern.

Final Thought

Essex Police has paused live facial recognition because the bias problem was exposed, not because the underlying power is too intrusive. Yet the same report that flagged fairness concerns also found no statistically significant short-term crime reduction. If facial recognition scans millions, delivers limited arrests, and cannot show it actually reduces crime, what exactly are we being asked to accept it for?

g.calder
I’m George Calder — a lifelong truth-seeker, data enthusiast, and unapologetic question-asker. I’ve spent the better part of two decades digging through documents, decoding statistics, and challenging narratives that don’t hold up under scrutiny. My writing isn’t about opinion — it’s about evidence, logic, and clarity. If it can’t be backed up, it doesn’t belong in the story. Before joining Expose News, I worked in academic research and policy analysis, which taught me one thing: the truth is rarely loud, but it’s always there — if you know where to look. I write because the public deserves more than headlines. You deserve context, transparency, and the freedom to think critically. Whether I’m unpacking a government report, analysing medical data, or exposing media bias, my goal is simple: cut through the noise and deliver the facts. When I’m not writing, you’ll find me hiking, reading obscure history books, or experimenting with recipes that never quite turn out right.