Breaking News

UK Ministry of Justice has developed a computer program to predict who might commit murder



An algorithm has been developed by the UK’s Ministry of Justice to predict which criminals might later commit murder. The tool, originally named the “Homicide Prediction Project,” has since been renamed “Sharing Data to Improve Risk Assessment.”

Led by the Ministry of Justice and involving Greater Manchester Police, the Home Office and the Metropolitan Police, the project started in January 2023 and, according to a timeline obtained by Statewatch via the Freedom of Information Act, was completed in December 2024 but has yet to be deployed.

The project raises concerns about bias and the ethical implications of using such predictive models on vast datasets, including data on people already facing structural discrimination in the UK, such as those within the white British ethnic group, particularly white men.

Related: The Sentencing Council has suspended its “two-tier” sentencing guidelines but anti-white, anti-male bias remains in UK’s judiciary

Related: Lawfare, traditionally employed by the left, was used by the right to defeat two-tier sentencing guidelines

Let’s not lose touch…Your Government and Big Tech are actively trying to censor the information reported by The Exposé to serve their own needs. Subscribe now to make sure you receive the latest uncensored news in your inbox…


The following was originally published by The National Pulse.

The British Ministry of Justice is advancing an initiative to create an algorithmic tool aimed at predicting which individuals convicted of crimes might escalate to committing homicide. Known internally as the Homicide Prediction Project, the undertaking came to light through Freedom of Information requests by the civil liberties group Statewatch, which flagged the project as concerning.

The project is designed to build upon risk-prediction frameworks already in place, such as the Offender Assessment System (“OASys”), which has been used since 2001 to forecast recidivism and inform legal decisions. However, the broad scope of data in this new model has raised red flags: sourced from various police and government bodies, it potentially includes information on up to half a million people, some without any criminal history.
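Neither the Ministry of Justice nor Statewatch has published the internals of the new model, so the sketch below is only a hypothetical illustration of how an actuarial risk-scoring tool of this general kind tends to work: a statistical classifier is fitted to historical offender records and outputs a probability that is then bucketed into risk bands. Every variable name and data point here is invented for the example (it assumes Python with NumPy and scikit-learn), and none of it reflects the actual OASys or Homicide Prediction Project code.

# Hypothetical sketch of an actuarial risk-scoring model.
# All features, weights and data are invented; this is not the MoJ system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000

# Invented offender features: age at first offence, number of prior
# convictions, years since last offence, prior violent offence flag.
X = np.column_stack([
    rng.integers(12, 40, n),
    rng.poisson(2.0, n),
    rng.exponential(3.0, n),
    rng.integers(0, 2, n),
])

# Invented outcome: 1 = reoffended during follow-up, 0 = did not.
logits = -2.0 + 0.4 * X[:, 1] + 0.8 * X[:, 3] - 0.2 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The classifier outputs a probability, which tools of this kind then
# bucket into risk bands (low / medium / high / very high).
risk = model.predict_proba(X_test)[:, 1]
bands = np.digitize(risk, [0.25, 0.5, 0.75])
print("example risk scores:", np.round(risk[:5], 2))
print("example risk bands:", bands[:5])

The controversy reported here is less about this arithmetic than about what feeds it: the breadth and provenance of the records used, and who ends up being scored.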

Despite officials’ assertions that the project remains in a research phase, uncovered documents allude to future deployments. Sources claim increased collaboration across government agencies and police forces, such as Greater Manchester Police and the Metropolitan Police, to enhance the dataset driving these predictions.

Statewatch has raised ethical concerns about the predictive model’s potential for systemic bias. The British state has already attempted to introduce guidelines that were explicitly two-tier and would have seen ethnic minorities prioritised for bail ahead of white men.

Statewatch’s Sofia Lyall described the algorithm project as “chilling and dystopian,” calling for an immediate halt to its development. “Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed,” she said. She highlighted the risk that such algorithms profile people as potential criminals before any crime has been committed.

Read more: UK: Ministry of Justice secretly developing ‘murder prediction’ system, Statewatch, 8 April 2025

Related: UK: Over 1,300 people profiled daily by Ministry of Justice AI system to ‘predict’ re-offending risk, Statewatch, 9 April 2025

Your Government & Big Tech organisations
try to silence & shut down The Expose.

So we need your help to ensure
we can continue to bring you the
facts the mainstream refuses to.

The government does not fund us
to publish lies and propaganda on their
behalf like the Mainstream Media.

Instead, we rely solely on your support. So
please support us in our efforts to bring
you honest, reliable, investigative journalism
today. It’s secure, quick and easy.

Please choose your preferred method below to show your support.



Rhoda Wilson
While it was previously a hobby, culminating in writing articles for Wikipedia (until things took a drastic and undeniable turn in 2020) and a few books for private consumption, since March 2020 I have been a full-time researcher and writer, in reaction to the global takeover that came into full view with the introduction of covid-19. For most of my life, I have tried to raise awareness that a small group of people planned to take over the world for their own benefit. There was no way I was going to sit back quietly and simply let them do it once they made their final move.

Categories: Breaking News


17 Comments
daisy
4 months ago

This is taxpayers’ money being wasted on this nonsense.
An awful lot of people need to lose their jobs.

aida
Reply to daisy
4 months ago

it is spelt: heads

Marcus C Martin
4 months ago

The people who funded and conceived the algorithm are the murderers in this World – classic inversion

Petra
4 months ago

Such a program isn’t that difficult to make and I can give you a program line that would be the core of any such program:

IF NAME$ = “Mohammed” OR NAME$ = “Hassan” OR NAME$ = “Ali” THEN Risk = Risk + 1000

Yes, it’s that simple!

Ian T
4 months ago

I bet it didn’t predict that Matt Hancock would murder loads of old folks in care homes, even though he openly admitted to having all the syringe drivers at the ready and enough Midazolam to sink a battleship.

Michael Bolton
Reply to Ian T
4 months ago

Yep, Hancock needs to be behind bars sharing a shower with Big Vern and the other ladies in the Nonces wing.

Dave Owen
4 months ago

https://www.youtube.com/watch?v=VAGk2mvgBEk
What about sorting the cold cases first, in Rotherham, for a start.

Lynette Devries
4 months ago

Wasn’t there a movie with Tom Cruise where he was a law enforcement officer? It had a computer that predicted who would break the law.

Plebney
4 months ago

In the UK? What a joke. If the prediction were accurate it would be illegal.

vaboon
Reply to Plebney
4 months ago

Minority report

Michael Bolton
4 months ago

Let’s start with the bleedin’ obvious, shall we? 3rd world illegal immigrants and anyone who adheres to a certain desert religion for starters. Then add the WEF/Soros, WHO, Gates and the Clintons and that would be a good start.


aida
4 months ago

I am from two non-European countries. People looked up to the West for its apparent liberty, free press, etc., but more and more clearly, a lot of this was not so true. Now the West is speeding towards a nightmarish dystopia. When the heck are people going to wake up and take care of the sold-out, blackmailed, corrupt judges, politicians, etc.?

Anon
4 months ago

Automated racist profiling. Will the computer call it hate speech if you call the computer racist?

SuziAlkamyst
4 months ago

The results would be phenomenal if it were applied to top UK politicians, who send soldiers to murder and be murdered, for starters.