An algorithm has been developed by the UK’s Ministry of Justice to predict which criminals might later commit murder. The tool, originally named the “Homicide Prediction Project,” is now called “Sharing Data to Improve Risk Assessment.”
Led by the Ministry of Justice, Greater Manchester Police, Home Office and Metropolitan Police, the project started in January 2023 and, according to a timeline obtained by Statewatch via the Freedom of Information Act, was completed in December 2024 but is yet to be deployed.
It raises concerns about bias and the ethical implications of using such predictive models on vast datasets that include people already facing structural discrimination in the UK, such as those within the white British ethnic group, particularly white men.
Related: The Sentencing Council has suspended its “two-tier” sentencing guidelines but anti-white, anti-male bias remains in UK’s judiciary
Related: Lawfare, traditionally employed by the left, was used by the right to defeat two-tier sentencing guidelines
The following was originally published by The National Pulse.
The British Ministry of Justice is advancing in its initiative to create an algorithmic tool aimed at predicting which individuals convicted of crimes might escalate to committing homicide. Known internally as the Homicide Prediction Project, the undertaking emerged through Freedom of Information requests from the civil liberties group Statewatch, which flagged the project as concerning.
Expanding on risk-prediction systems already in place, the project is designed to build upon frameworks such as the Offender Assessment System (“OASys”), which has been used since 2001 to forecast recidivism and inform legal decisions. However, the broad scope of data in this new model has raised red flags. Data utilised, sourced from various police and government bodies, potentially includes information on up to half a million people, some without any criminal history.
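For readers unfamiliar with how actuarial risk tools of this kind generally work, the minimal sketch below (in Python) shows the basic mechanics: a handful of weighted factors combined into a single score between 0 and 1. The factor names, weights and logistic form here are illustrative assumptions only and do not reflect the actual OASys or Ministry of Justice model.

# Illustrative sketch only: a toy actuarial-style risk scorer.
# The factor names, weights and logistic form are hypothetical assumptions,
# not the actual OASys or Ministry of Justice model.
import math

EXAMPLE_WEIGHTS = {
    "prior_violent_convictions": 0.9,      # hypothetical weight
    "age_under_21_at_first_conviction": 0.6,
    "supervision_breaches": 0.4,
}
BASELINE = -3.0  # hypothetical intercept giving a low base rate

def risk_score(factors):
    """Combine weighted factors into a 0-1 score via a logistic function."""
    linear = BASELINE + sum(
        EXAMPLE_WEIGHTS.get(name, 0.0) * value for name, value in factors.items()
    )
    return 1.0 / (1.0 + math.exp(-linear))

# A made-up case, purely to show the arithmetic.
print(round(risk_score({"prior_violent_convictions": 2, "supervision_breaches": 1}), 3))

Real systems such as OASys weigh far more static and dynamic factors, but the principle of reducing a person’s recorded history to a single score that informs legal decisions is the same.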
Despite officials’ assertions that the project remains in a research phase, uncovered documents allude to future deployments. Sources claim increased collaboration across government agencies and police forces, such as Greater Manchester Police and the Metropolitan Police, to enhance the dataset driving these predictions.
Statewatch has raised ethical concerns about the predictive model’s potential for systemic bias. The British state has already attempted to introduce guidelines that were explicitly two-tier and would have seen ethnic minorities prioritised for bail over white men in the country.
Statewatch’s Sofia Lyall described the algorithm project as “chilling and dystopian,” calling for an immediate cessation of its development. “Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed,” she said. She highlighted the risk algorithms pose in creating profiles of potential criminals before any crime is committed.
Read more: UK: Ministry of Justice secretly developing ‘murder prediction’ system, Statewatch, 8 April 2025
Related: UK: Over 1,300 people profiled daily by Ministry of Justice AI system to ‘predict’ re-offending risk, Statewatch, 9 April 2025
This is taxpayers’ money being wasted on this nonsense.
An awful lot of people need to lose their jobs.
It is spelt: heads.
The people who funded and conceived the algorithm are the murderers in this World – classic inversion
Such a program isn’t that difficult to make and I can give you a program line that would be the core of any such program:
IF NAME$ = “Mohammed” OR NAME$ = “Hassan” OR NAME$ = “Ali” THEN Risk = Risk + 1000
Yes, it’s that simple!
I bet it didn’t predict that Matt Hancock would murder loads of old folks in care homes, even though he openly admitted to having all the syringe drivers at the ready and enough Midazolam to sink a battleship.
Yep, Hancock needs to be behind bars sharing a shower with Big Vern and the other ladies in the Nonces wing.
https://www.youtube.com/watch?v=VAGk2mvgBEk
What about sorting the cold cases first, in Rotherham, for a start.
Wasn’t there a movie with Tom Cruise where he was a law enforcement officer, with a computer that predicted who would break the law?
In the UK? What a joke. If the prediction were accurate, it would be illegal.
Minority Report
Let’s start with the bleedin’ obvious shall we? 3rd world illegal immigrants and anyone who adheres to a certain desert religion for starters. Then add the WEF/Soros, WHO, Gates and the Clintons and that would be a good start.
I am from two non-European countries. People looked up to the West for its apparent liberty, free press, etc., but more and more clearly, a lot of this was not so true. Now the West is speeding towards a nightmarish dystopia. When the heck are people going to wake up and take care of the sold, blackmailed, corrupt judges, politicians, etc.?
Automated racist profiling. Will the computer call it hate speech if you call the computer racist?
The results would be phenomenal if it were applied to top UK politicians, who send soldiers to murder and be murdered, for starters.