Did You Know?

Facial Recognition Wrongly Jailed Woman for Crimes in a State She’d Never Even Visited

Angela Lipps, a 50-year-old woman from Tennessee, says she spent months in jail over fraud offences in North Dakota, a state she says she had never visited, after police used facial recognition to identify her from surveillance footage. According to reports, she was arrested, extradited more than 1,000 miles away, and left to rely on ordinary records to establish that she could not have committed the offences in question. The charges were later dismissed after her lawyer produced bank records showing she was in Tennessee at the time.

The case is striking on its own facts, but it also lands at a moment when the public case for facial recognition already looks weak. In the UK, Cambridge-led research commissioned by Essex Police found “no statistically significant evidence” that live facial recognition reduced crime in the short term. That leaves an uncomfortable combination: a technology that intrudes deeply into personal privacy, can produce life-altering errors, and still struggles to demonstrate the broader crime-reduction benefits used to justify its spread.

Angela Lipps Wrongly Imprisoned After Facial Recognition Error

How Facial Recognition Wrongly Led to Jail Time for Tennessee Grandmother

According to reporting, Lipps was arrested in July 2025 while babysitting her grandchildren after investigators in Fargo used facial recognition in a bank fraud case and concluded she matched the suspect. She says she had never been to North Dakota. She was jailed in Tennessee for nearly four months before extradition, and was ultimately released after her lawyer produced records showing she was more than 1,200 miles away when the crimes were committed.

Investigators did not rely on the software match alone. They also compared the suspect with Lipps’ social media images and driver’s licence photo, a step often presented as a safeguard. In practice, that human review did not prevent arrest, detention, extradition, or the collapse of an ordinary life. Lipps says she lost her home, her car, and her dog. Fargo’s police chief later said procedures would be reviewed.

A software-assisted identification, followed by human confirmation, was treated as strong enough to take away a person’s liberty for months. The exculpatory evidence turned out to be routine bank records. It does not speak well of the follow-up process, but ultimately it was the initial facial recognition match that led to her arrest.

It’s Happening More Than You Think

Lipps’s case is not a freak event emerging from nowhere. The ACLU said in 2024 that there had already been at least seven known wrongful arrests in the United States tied to police reliance on incorrect facial-recognition results. The organisation argued that simple warnings about not relying solely on face recognition were not preventing these harms in practice.

The technical concerns are also longstanding. NIST’s Demographic Effects in Face Recognition report says false positives differ by age, sex, and race. Earlier NIST evaluations found that many one-to-many matching systems showed substantially higher false-positive rates for some demographic groups than others.

That is one reason official reassurances have limited force. Facial recognition is often described as merely an investigative lead, with the final judgment left to human officers. Cases like this show how little protection that distinction may offer in practice. Once a facial match is treated as sufficiently credible to justify arrest and detention, the burden quickly falls on the person identified to prove the system was wrong.

Recent UK News Further Discredits Facial Recognition Policing

In our recent piece, “Essex Police Pause Facial Recognition, But Why Was It Rolled Out at All?”, we outlined a UK law enforcement decision to pause the use of the software because of its skewed accuracy figures. Cambridge-led research also found “no statistically significant evidence that LFR deployments reduced crime”, calling the official narrative into question. Crime levels in deployment areas did not change before, during, or after facial recognition operations were launched; the technology instead risks infringing on the privacy of innocent people without any evidence to justify it.

Despite those UK findings and cases such as Lipps’ false imprisonment, the public is still expected to accept mass biometric surveillance because it “improves safety”. Yet Cambridge’s research found no significant reduction in crime, and the ACLU’s findings show that police reliance on the technology keeps producing wrongful arrests. Meanwhile, real-world failures continue to show how high the cost of error can be, while millions of people are tracked without their permission.

Even the Essex review, which broadly favoured the system’s operational accuracy, acknowledged that live facial recognition scans very large numbers of people to identify a tiny number of suspects, and said decisions about its use should weigh proportionality, transparency, and proper oversight.

What Happened to the Wrongly Imprisoned Angela Lipps?

After being arrested by US Marshals in July 2025 at her home in Tennessee, Lipps was booked into county jail as a fugitive from justice from North Dakota. She told local news outlets: “I’ve never been to North Dakota; I don’t know anyone from North Dakota,” adding, “It was so scary. I can still see it in my head, over and over again.”

Lipps explained that only after she was detained did she learn she had been arrested on charges including four counts of unauthorised use of personal identifying information and four counts of theft in North Dakota. Police in Fargo had been investigating a string of bank fraud cases between April and May 2025, and facial recognition software identified Lipps as the suspect.

Lipps sat in county jail in Tennessee for four months without bail and was unable to plead her case until she could be extradited to North Dakota. Once there, she finally retained a lawyer and was interviewed by police for the first time on 19 December. Her attorney proved via her own bank records that she had been in Tennessee at the time the crimes were committed in North Dakota.

Her attorney said: “The investigation and arrest of Angela relied solely on facial recognition. The Fargo Police Department did not contact Angela Lipps until I provided them her bank records and arranged an interview with her.” Five days after that interview, the charges were dismissed and she was released from jail. However, the police provided no funds to help her return home; she relied on money from defense attorneys to pay for a hotel and, eventually, transportation back to Tennessee.

In a statement to People, Fargo Police Chief Dave Zibolski said, “The issuance of an arrest warrant for Ms. Lipps indicates that a court determined probable cause existed for the charges. While the charges were later dismissed without prejudice, that procedural step simply means the charges may be re-filed if additional investigation supports doing so. The Fargo Police Department continues to actively investigate this matter and continues to follow the criminal justice process.”

Zibolski added: “The investigation remains ongoing with respect to all individuals involved. Because the case is still open and active, I am not providing additional comment at this time to avoid compromising the investigation.”

Final Thought

Facial recognition is sold as a rational compromise between privacy and safety. Cases like this, and the evidence from Essex, suggest the compromise is much harder to defend than its advocates imply. A woman can lose months of her life after being linked by facial recognition to crimes in a state she says she had never entered, while a police-commissioned study still cannot show a statistically significant short-term reduction in crime from deploying the technology. These are not side issues; they go to the centre of the argument.

If the benefits remain so uncertain and the failures so serious, expansion starts to look less like prudent policing and more like the normalisation of an intrusive power that has not earned public trust.

g.calder
I’m George Calder — a lifelong truth-seeker, data enthusiast, and unapologetic question-asker. I’ve spent the better part of two decades digging through documents, decoding statistics, and challenging narratives that don’t hold up under scrutiny. My writing isn’t about opinion — it’s about evidence, logic, and clarity. If it can’t be backed up, it doesn’t belong in the story. Before joining Expose News, I worked in academic research and policy analysis, which taught me one thing: the truth is rarely loud, but it’s always there — if you know where to look. I write because the public deserves more than headlines. You deserve context, transparency, and the freedom to think critically. Whether I’m unpacking a government report, analysing medical data, or exposing media bias, my goal is simple: cut through the noise and deliver the facts. When I’m not writing, you’ll find me hiking, reading obscure history books, or experimenting with recipes that never quite turn out right.