Do you remember the 2002 Tom Cruise movie Minority Report, in which three very special people are wired to brain scanners so they can predict when someone is about to commit a serious crime such as murder? As usual, Hollywood got the big idea right and the little details wrong. In the 2020s we now have very impressive artificial intelligence systems, built from computers rather than “special people,” that can examine billions of facts about millions of people and make reasonable extrapolations about the future.
AI can quantify what a psychologist, a police detective, or an experienced judge might only have as a gut feeling; it can find the patterns humans can’t.
Predictive policing is the use of data analysis and machine learning to identify and prevent potential crimes. It is a controversial topic that has raised ethical and social concerns about the accuracy, fairness, and accountability of the algorithms involved. Let’s explore how generative AI can be applied to predictive policing, and some of the challenges and opportunities of this approach.
Generative AI is a branch of artificial intelligence that creates new data or content resembling the data it was trained on. For example, generative AI can produce realistic images, text, sound, or video based on a given input or a latent space, and it can be used for many purposes, such as entertainment, education, art, design, or research.
One of the potential applications of generative AI is to enhance predictive policing by generating realistic scenarios or simulations of future crimes. For instance, generative AI can create synthetic data that reflects the patterns and trends of crime in a given area, based on historical data and other factors. This synthetic data can then be used to train predictive models that can forecast where and when crimes are likely to occur. Alternatively, generative AI can create virtual environments that mimic the real world and allow law enforcement agencies to test different strategies or interventions to prevent or reduce crime.
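To make the idea of “synthetic data that reflects the patterns of crime” concrete, here is a deliberately toy sketch in Python. It stands in for a real generative model with something much simpler: it fits a crude per-coordinate Gaussian to a handful of made-up historical incident locations, then samples new synthetic incidents from that fit. All of the numbers, names, and the modeling choice are illustrative assumptions, not anything a real agency uses.

```python
import random
import statistics

# Hypothetical historical incident coordinates for one district --
# invented numbers for illustration only, not real crime data.
historical = [(0.12, 0.30), (0.15, 0.28), (0.11, 0.33),
              (0.80, 0.72), (0.78, 0.75), (0.83, 0.70)]

def fit_simple_generator(points):
    """Fit an intentionally crude 'generative model': one Gaussian per
    coordinate, with mean and standard deviation estimated from the
    historical sample."""
    xs, ys = zip(*points)
    return ((statistics.mean(xs), statistics.stdev(xs)),
            (statistics.mean(ys), statistics.stdev(ys)))

def sample_synthetic(model, n, seed=0):
    """Draw n synthetic incidents that mimic the historical distribution.
    A predictive model could then be trained on this enlarged sample."""
    (mx, sx), (my, sy) = model
    rng = random.Random(seed)
    return [(rng.gauss(mx, sx), rng.gauss(my, sy)) for _ in range(n)]

model = fit_simple_generator(historical)
synthetic = sample_synthetic(model, 100)
```

A production system would use a far richer model (a GAN, VAE, or diffusion model over space, time, and incident type), but the essential point is the same: the synthetic data can only echo whatever patterns, and whatever biases, were present in the historical records it was fit to.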
There are several potential benefits of using generative AI for predictive policing: it can supply richer training data where real crime records are sparse, and it can let agencies rehearse strategies in simulation before trying them on real communities.
There are several serious challenges and risks of using generative AI for predictive policing:
Artificial intelligence is very good at extrapolating from what it already knows, so if 40% of today’s prison population is black males, AI is very likely to overemphasize the role of black males in predicting future crimes. We see these biases in human police today and should expect nothing different from AI.
To put it bluntly, we think generative AI just won’t get it right.
Our entire legal system is based on the premise that you are innocent until proven guilty, and that it is better to have 100 murderers go free than to have one innocent person go to jail or be executed. AI guidance, by contrast, offers only statistical likelihoods based on historical, which is to say flawed, data. Computer scientists simplify this down to the phrase “garbage in, garbage out.”
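The “garbage in, garbage out” problem can be shown with a deliberately minimal sketch. The group names and the 40/60 split below are made-up illustrations: a trivial “risk model” trained on skewed historical arrest records simply reproduces the skew, because arrest records measure who was arrested, not who actually offended.

```python
from collections import Counter

# Hypothetical, deliberately skewed historical arrest records.
# Each entry records which group an arrestee belonged to -- nothing
# about actual guilt or underlying offense rates.
historical_arrests = ["group_a"] * 40 + ["group_b"] * 60

def train_frequency_model(records):
    """A minimal 'model': learn each group's share of past arrests
    and treat that share as a future 'risk score'."""
    counts = Counter(records)
    total = len(records)
    return {group: n / total for group, n in counts.items()}

model = train_frequency_model(historical_arrests)

# The model's scores are just the historical arrest shares: biased
# inputs yield biased outputs, laundered through a computation.
```

Real predictive-policing models are vastly more complex, but the failure mode is the same: whatever enforcement bias is baked into the historical data comes back out as an apparently objective score.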
We think that generative AI in predictive policing should be used more as a tool to rule people out than to rule them in. The problem is that we have seen, time and time again, that the judicial system in general, and police on the ground most of all, tend to focus nearly all of their attention on the first viable suspect. This has led to countless miscarriages of justice and wrongful convictions all across the globe. If we’re going to allow generative AI to interact with police databases, we had best get the rules of engagement worked out in advance.
In conclusion, generative AI is a promising but complex technology that can have significant implications for predictive policing. It can offer new opportunities for improving the effectiveness and efficiency of crime prevention, but it can also raise new challenges and concerns for ensuring the fairness and accountability of the algorithms involved. Therefore, it is important to develop and implement generative AI for predictive policing with caution and care, taking into account the ethical and social values and norms of the stakeholders involved.