We remember that 2002 movie by Steven Spielberg, don’t we? Set in the year 2054, which is coming up faster than we want to think, his vision of the future featured cars that drove themselves, parked themselves, and traveled vertically up the sides of buildings. That was the cool part. The future almost seemed as though it might not be the total piece of trash we expect it to be.
The whole story hinges on a set of triplets, ‘pre-cogs’ who have the ability to see into the future and predict with near-certain accuracy when someone is going to commit a murder and who the victim(s) will be. These crimes are handled by the ‘pre-crime’ unit of some unspecified law enforcement agency. Their job is to track down the person(s) responsible before they have a chance to commit the crime. It’s an interesting plot that manages to go absolutely haywire. A sequel series that followed the lives of the pre-cogs after their release from the system wasn’t nearly as interesting and was quickly cancelled.
The movie is still available to stream through Prime or Paramount+ if you’re interested.
At the end of the day, though, it’s just a movie, isn’t it? We’ve always thought so. The idea of determining whether or when someone is going to commit a crime before they’ve even formed the thought seems more than a little preposterous. There’s no way to have any form of due process because the entire system relies upon trusting the accuracy of the pre-cogs. People like the pre-cogs are a fantasy, and there would be no way to create such a system without them.
Think again. This is real-life nightmare material we’re dealing with. The Guardian is reporting that the UK government is building a “murder prediction” program that uses personal data scraped from lord knows where to predict who is likely to commit murder. The system won’t be able to say when the murder(s) would occur, only that the subject studied has the requisite personality traits and tendencies of someone who would commit murder.
Data. You know, the stuff they get from observing what you’ve been posting on social media sites, especially those you think are secret. Data is always everywhere, involved in everything from where you shop to what you buy to what you ate for dinner (which we know because you just had to take a picture of it). You’ve been warned for years that your personal data was going to end up someplace that would cause you trouble. Now, it has.
The scheme was originally called the “homicide prediction project”, but its name has been changed to “sharing data to improve risk assessment”. The Ministry of Justice hopes the project will help boost public safety, according to The Guardian. The existence of the project was discovered by the pressure group Statewatch, and some of its workings were uncovered through documents obtained by Freedom of Information requests.
Government officials familiar with the project insist that it only uses the data of people who have previously committed a crime. What crime? That doesn’t seem to matter. Rack up too many parking tickets and you’re in the system. Fail to keep your lawn mowed at an acceptable height and you’re in the system.
Statewatch contradicts that statement, though, saying that data from people not convicted of any criminal offense will be used as part of the project, including personal information about self-harm and details relating to domestic abuse. Now, think about the direction that the world is going. Who are you most likely to believe, the Government or the whistleblowers?
The project was commissioned by the government when Rishi Sunak was the prime minister. They’ve had plenty of time to work on the algorithms and work out a couple of levels of inevitable kinks. One of the disturbing conditions is that the police would be in charge of determining which data, and how much of it, would be collected on an individual. This echoes the kind of errors we see now as ICE agents are allowed to determine who is in a gang and largely get it wrong. If police can’t tell a gang sign from a sports tattoo, do we really want to trust them to guess who’s going to commit murder?
Sofia Lyall, a researcher for Statewatch, said: “The Ministry of Justice’s attempt to build this murder prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems. Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed.
“This latest model, which uses data from our institutionally racist police and Home Office, will reinforce and magnify the structural discrimination underpinning the criminal legal system. Like other systems of its kind, it will code in bias towards racialised and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction, and disability is highly intrusive and alarming.”
The UK Government still insists that all this work is for “research purposes only.” Again, are we truly stupid enough to believe that line? At the moment, the system is using data from incarcerated persons or those recently incarcerated. Using the existing system as a baseline, the hope is to improve upon its risk-assessment capabilities.
And if you think there’s a chance that the Brits are the only ones working on such a system, you really need to pay more attention to what’s going on in the world. We don’t need a special algorithm to tell us that no government, anywhere, can be trusted with our personal information.