AI fails to predict crime. But researchers won’t stop trying.


In the world of the 2002 film “Minority Report,” crime is almost nonexistent. Clairvoyants predict when a murder is about to happen, allowing police to swoop in and arrest the soon-to-be criminals.

Although Tom Cruise’s all-seeing police force is the stuff of a dystopian society, scientists have long chased the tempting prospect of predicting crime before it happens.

And as the United States faces rising rates of violent crime, the latest such effort emerged last month: A group of researchers from the University of Chicago unveiled an algorithm that, a press release boasted, can predict crime with “90% accuracy.”

The algorithm identifies locations in major cities that it calculates have a high probability of crimes, such as homicides and burglaries, occurring in the next week. The software can also evaluate how policing varies across neighborhoods in eight major U.S. cities, including Chicago, Los Angeles and Philadelphia.

But using artificial intelligence to direct law enforcement rings alarm bells for many social-justice researchers and criminologists, who cite a long history of such technology unfairly recommending increased policing of Black and Latino people. Even one of the study’s authors acknowledges that an algorithm’s ability to predict crime is limited.

“The past does not tell you anything about the future,” said Ishanu Chattopadhyay, a professor at the University of Chicago and the lead researcher on the algorithm. “The question is: To what extent does the past actually affect the future? And to what extent are events spontaneous or truly random? … Our ability to predict is limited by that.”


Police have long used whatever tools were available to try to predict crime. Before these technological advances, officers huddled in conference rooms and stuck pins into maps of crime incidents, hoping the clusters would help them figure out where to look next.

Over the past 15 years, the country’s largest police departments – such as New York, Los Angeles and Chicago – began exploring ways to use artificial intelligence not only to analyze crime, but to predict it. They often turned to data-analytics companies like PredPol and Palantir, which build software that law enforcement can use to forecast crime.

Predictive policing tools are built by feeding data – such as crime reports, arrest records and license plate images – into an algorithm trained to look for patterns and predict where and when a particular type of crime will occur in the future.
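In rough outline, such a pipeline might look like the sketch below: a city is divided into grid cells, recent incident counts become features, and a classifier scores each cell’s risk for the following week. This is a minimal illustration on simulated data – commercial systems like PredPol’s are proprietary, and every name and number here is invented for the example.

```python
# Illustrative sketch only: simulate a city grid and train a classifier
# to flag cells where a crime is likely next week, from past counts.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_cells, n_weeks = 400, 104               # 20x20 grid, two years of history

# Synthetic incident counts per cell per week (Poisson, varying rates)
base_rate = rng.gamma(shape=2.0, scale=0.05, size=n_cells)
counts = rng.poisson(base_rate, size=(n_weeks, n_cells))

# Features: the last 4 weeks of counts per cell; label: any crime next week
X, y = [], []
for t in range(4, n_weeks - 1):
    for c in range(n_cells):
        X.append(counts[t - 4:t, c])
        y.append(int(counts[t + 1, c] > 0))
X, y = np.array(X), np.array(y)

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X[-n_cells:])[:, 1]   # next-week risk per cell
print("highest-risk cells:", np.argsort(risk)[-5:])
```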

But algorithms are only as good as the data they are fed – and that is a particular problem in the United States, said Vincent Southerland, co-faculty director of New York University’s Center on Race, Inequality, and the Law.

Historically, police data in the United States has been biased, according to Southerland. Police are more likely to arrest or charge someone with a crime in low-income neighborhoods whose residents are predominantly people of color – a reality that does not necessarily reflect where crime takes place, but where police spend their time.

This means that most data sets of criminal activity over-represent people of color and low-income neighborhoods. Feeding this data into an algorithm leads it to suggest that more criminal activity occurs in these areas, creating a feedback loop that is racially and socioeconomically biased, Southerland added.
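To see why that loop is self-reinforcing, consider a toy simulation – purely illustrative, with invented numbers – in which two areas have identical true crime rates, but crime is only recorded where patrols are, and patrols are reallocated according to the recorded data:

```python
# Toy illustration of the feedback loop: two areas with the SAME true
# crime rate; crime is only recorded where police patrol, and patrols
# are reallocated according to the recorded data. Numbers are invented.
import numpy as np

rng = np.random.default_rng(1)
true_rate = np.array([10.0, 10.0])   # identical expected crimes per day
patrols   = np.array([0.6, 0.4])     # slight initial skew toward area 0
recorded  = np.zeros(2)

for day in range(365):
    crimes = rng.poisson(true_rate)        # what actually happens
    recorded += crimes * patrols           # only seen where patrolled
    patrols = recorded / recorded.sum()    # reallocate based on the data

print(patrols)  # the initial skew persists: the data "confirms" itself
```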

“You have data that is infected or tainted by some bias – and that bias will show up on the other side of the analysis,” he said. “You get out of it what you put into it.”


In the real world, predictive policing software has stirred controversy.

In 2019, the Los Angeles Police Department suspended its crime-prediction program, LASER – which used historical crime data to forecast crime hot spots and Palantir software to assign people criminal risk scores – after an internal audit found it had led police to unfairly single out Black and Latino people for extra monitoring.

In Chicago, police used predictive policing software from the Illinois Institute of Technology to generate a list of the people deemed most likely to be involved in a violent crime. A study by RAND and a subsequent investigation by the Chicago Sun-Times showed that the list came to include every single person arrested or fingerprinted in Chicago since 2013. The program was scrapped in 2020.

Predictive policing algorithms are “not a crystal ball,” said John S. Hollywood, a senior operations researcher at RAND who helped review the Chicago Police Department’s use of predictive algorithms. “It’s better to look more holistically … at what’s going on in terms of specific things in my community that are leading to crime right now.”

Chattopadhyay said his team’s software was built with that troubled history in mind.

To create the algorithm, Chattopadhyay’s team divided major cities into 1,000-square-foot blocks and trained it on city crime data from the past three to five years. The algorithm indicates whether there is a high or low risk of crime occurring in a given block at any time up to a week into the future.

To limit bias, the team omitted data on offenses such as marijuana arrests, traffic stops and other low-level petty crimes, because research shows that Black and Latino people are more often targeted for these types of offenses. Instead, they fed the algorithm data on homicides, assaults and batteries, along with property crimes such as burglaries and motor vehicle thefts.
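A hypothetical sketch of that filtering step – with invented column names and categories, since the team’s actual code is not reproduced here – might look like this:

```python
# Hypothetical sketch of the filtering the article describes: keep only
# serious violent and property crimes; low-level, bias-prone offenses
# never reach the model. Column names and categories are invented.
import pandas as pd

KEEP = {"HOMICIDE", "ASSAULT", "BATTERY", "BURGLARY", "MOTOR VEHICLE THEFT"}

def filter_incidents(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows outside the allowed crime categories."""
    return df[df["primary_type"].isin(KEEP)].copy()

incidents = pd.DataFrame({
    "primary_type": ["HOMICIDE", "NARCOTICS", "BURGLARY"],
    "tile_id": [17, 17, 42],    # which city segment the incident fell in
    "week": [0, 0, 1],
})
print(filter_incidents(incidents))  # the NARCOTICS row is excluded
```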

But the main point of the study, he said, was to use the algorithm to interrogate where policing itself is biased. His team compared arrest data from neighborhoods at different socioeconomic levels and found that crime in wealthier areas led to more arrests, whereas crime in poorer neighborhoods did not always have the same effect – indicating a discrepancy in enforcement.

Chattopadhyay said these findings help provide evidence for people who complain that law enforcement ignores poorer neighborhoods when there is an increase in violent crime or property crime. “This allows you to quantify it,” he said. “To show the evidence.”
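One simple way to express that kind of discrepancy as a number – with invented figures, purely for illustration – is the arrest rate per reported serious crime in each neighborhood tier:

```python
# Invented figures, for illustration only: arrests per reported serious
# crime, broken out by neighborhood income tier.
crimes  = {"higher_income": 200, "lower_income": 800}  # reported crimes
arrests = {"higher_income": 90,  "lower_income": 160}  # resulting arrests

for tier in crimes:
    rate = arrests[tier] / crimes[tier]
    print(f"{tier}: {rate:.0%} of reported crimes led to an arrest")
# higher_income: 45%, lower_income: 20% - the gap is the discrepancy
```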

Arvind Narayanan, a professor of computer science at Princeton University, said the study’s press release, and news articles about it, focused too little on the study’s attempt to investigate bias in policing and leaned too heavily on the algorithm’s claimed accuracy.

“For predictive policing, a single number for accuracy … is completely insufficient to assess whether a tool is useful or fair,” he said. “Crime is rare, so it’s likely that most crime predictions are false positives.”
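Narayanan’s point is a base-rate argument, and the arithmetic is easy to check. Assuming – hypothetically – that crime occurs in 1 percent of tile-weeks, and reading “90% accuracy” as 90 percent sensitivity and 90 percent specificity:

```python
# Base-rate check with hypothetical numbers: crime in 1% of tile-weeks,
# and "90% accuracy" read as 90% sensitivity and 90% specificity.
prevalence, sensitivity, specificity = 0.01, 0.90, 0.90

true_pos  = prevalence * sensitivity
false_pos = (1 - prevalence) * (1 - specificity)
precision = true_pos / (true_pos + false_pos)

print(f"precision: {precision:.1%}")  # ~8.3%: over 90% of alerts are false
```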


Criminologists, policing experts and technologists note that even if an algorithm is accurate, law enforcement can still use it to target people of color and residents of poorer neighborhoods for unwarranted surveillance and policing.

Andrew Papachristos, a sociology professor at Northwestern University, said that when law enforcement uses algorithms to map and analyze crime, it often subjects people of color and low-income communities to more policing. And when departments are criticized for over-policing certain neighborhoods, they often use the data to justify those tactics, he said.

Papachristos said a better use of the technology would be for community groups to use the tools instead – to figure out where to provide more social services, increase community engagement and address the root social causes of violence. That is unlikely to happen, though, he said, because the organizations doing this work are strapped for cash and skeptical of using data.

“They have seen data misused against them in court. They have seen it used to profile individuals,” he said. “So if someone like me rolls up and says, ‘Hey, we want to help you use data,’ it’s not an instant ‘Oh my God, thank you.’ It’s more like, ‘What data are you using?’”

Hollywood, of the RAND Corporation, agreed. He said that to really reduce crime, police departments need to work with social workers and community groups to address issues of education, housing and civic engagement.

“[Algorithms] are a bright, shiny object,” he said. “These things tend to be distractions.”
