This is the final part of a three-part series by Flint Beat exploring each element of the city’s plan to combat gun violence. In each installment, Flint Beat offers an in-depth, expert- and data-driven analysis of what residents can expect as Flint police carry out their plan. See our first installment, on Flint’s gun buyback program, here.

Flint, MI–The city of Flint has been implementing a controversial policing practice that some say is proactive and efficient, and others say perpetuates racial biases.

It’s called “predictive policing.”

On July 21, Mayor Sheldon Neeley and then-Police Chief Phil Hart announced a three-part plan to fight crime. One component of the plan is the formation of a special investigative unit (SIU) charged with finding and seizing illegal guns.

Flint’s new Police Chief Terence Green said the Crime Analysis Unit at the department provides weekly crime statistics to find trends and locate “hot spots” using NC4 policing software, a public safety data collection program.

“I would describe it as evidence-based and data-driven, so data suggests we should be operating in these areas,” Green said. “You could say it’s predictive policing.”

The idea of predictive policing is to use existing data about crime history and calls for service to locate and police “hot spots.”

“For example, the south district we’ve seen an uptick in violent crime in that area based on crime statistics, so we’ll deploy the SIU to that area for a period of time and they’re conducting proactive work in that area,” Green said. 

That proactive work, Green said, consists of “initiating traffic stops with the entire goal of taking illegal guns off the streets.”

Green said the current members of the SIU were already members of the Flint Police Department and some of them have tactical expertise in seizing illegal weapons.

“The unit has been very successful. Last month, the month of August, that unit seized 64 illegal firearms,” Green said. “It’s a very productive unit.”

From July 25 to Aug. 15, the SIU arrested 41 people and impounded 30 vehicles. It also recovered 50 grams of crack, 73 grams of cocaine, 25 grams of heroin, 15 grams of ecstasy and 50 grams of suspected crystal meth.

In 2016, a similar unit was announced under former Mayor Karen Weaver and former Police Chief Tim Johnson: the CATT Squad. Standing for Crime Area Target Team, it was made up of officers who policed areas of high crime.

According to a press release from the City of Flint in 2017, “the CATT Squad was instrumental in helping to get more than 100 guns off the streets and more than $100,000 in cash.”

Green said the CATT Squad, which is no longer in operation, was similar to the SIU, but had a “smorgasbord of different assignments,” whereas the SIU is mainly focused on seizing illegal weapons.

Proactive enforcement was the idea behind both programs.

Supporters of predictive policing feel that the method allows police to work more efficiently: if some areas have more crime than others, police should spend more time there.

People who oppose predictive policing say it can become something of a feedback loop. Areas flagged as “high crime” are policed more heavily, resulting in more arrests, which feed back into the system and tell police to keep policing those areas.

The Kansas City Police Department in the 1990s used tactics similar to the SIU’s to find and seize illegal guns. According to a 1995 study by the National Institute of Justice, the department was aiming to “reduce gun violence, drive-by shootings, and homicides in a patrol beat where the homicide rate was 20 times higher than the national average.”

Police put more patrols into an 80-block “hot spot” identified by a computer analysis of all of the gun crimes in the city.

Officers found and seized illegal guns by frisking people after arrests and by searching vehicles during traffic stops.

According to the study of this program, “gun seizures by police in the target area increased by more than 65 percent, while gun crimes declined in the target area by 49 percent.”

Researchers also found that there was “an average of 1 gun found in every 28 traffic stops,” making it the “most productive method of finding guns.”

But recent research on traffic stops by Kansas City police has found race to be a big factor.

Three professors from the University of Kansas began surveying drivers in the city in 2004 and continued their research for ten years. They published their findings in a book called Pulled Over: How Police Stops Define Race and Citizenship.

According to an NPR article about the book and research, the authors found two categories of traffic stops, which they call “traffic safety stops” and “investigatory stops.”

Safety stops were for “obvious violations of traffic law” and investigatory stops were for “small technical violations, which led to longer conversations with probing questions.”

The article stated that researchers found no racial disparity in traffic safety stops, but that Black drivers were “2.7 times more likely to be pulled over in an investigatory stop” and five times more likely to be subjected to a search than white drivers.

Green said racial profiling with the SIU should not be a concern. 

“They’re making a lawful traffic stop, racial profiling can’t even be considered due to the fact that these are experienced officers,” Green said. 

He said traffic stops are often for violations like running stop signs or speeding.

“I’m not sure in every instance what leads to searching the vehicle, but they have to get consent from the driver, or make some type of arrest that leads to seizing firearms,” Green said. 

He also said traffic stops aren’t the only way officers seize guns; calls about suspicious activity and the like can also lead to seizures of illegal guns.

Traffic stops aside, different social justice organizations take issue with the concept of predictive policing altogether. 

According to a journal article from the National Criminal Justice Reference Service, Richmond, VA had been experiencing an increase in random gunfire every New Year’s Eve until 2003, when the police department used “predictive policing” strategies to place officers at locations that had previously been a problem.

“The result was a 47 percent decrease in random gunfire and a 246 percent increase in weapons seized,” author Beth Pearsall wrote. “The department saved $15,000 in personnel costs.”

According to a Government Technology article, the Richmond Police Department saw a 21% decrease in crime from 2005 to 2006 due to these strategies. 

But Eli Coston, an assistant professor at Virginia Commonwealth University, said the decrease might not be due to predictive policing.

“There was a significant budget increase and the department put more cops on the beat,” Coston said. “There has been an overall trend in decline in crime in Richmond but there have been, when we look nationally even, overall declines in crime. Attributing those things only to policing efforts is a little bit disingenuous.”

The Richmond Police Department entered a contract last year with SOMA Global, a public safety solutions company, to implement a new records management system that could potentially be used for predictive policing.

Coston is part of the Richmond Transparency and Accountability Project, which is working to keep the police department from using the predictive policing software SOMA Global provides.

“Have police officers always relied on historical crime trends or hot spots? Sure, but then there was an actual individual responsible that you could hold accountable versus ‘a computer told us to go there’,” Coston said.

It’s particularly hard to hold a computer accountable because the data and algorithms aren’t always perfect.

Some studies show that predictive computer models can perpetuate existing racial biases and inaccurately determine a person or place’s level of risk. 

A study published by ProPublica found that one predictive modeling tool assigned Black people higher risk scores for re-offending than white people convicted of the same crimes.

“In forecasting who would re-offend, the algorithm made mistakes with Black and white defendants at roughly the same rate but in very different ways,” authors of the study wrote. “The formula was particularly likely to falsely flag Black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants. White defendants were mislabeled as low risk more often than Black defendants.”

Last year, the ACLU of Virginia wrote a letter to the Richmond mayor urging him to ban the use of predictive policing. 

“Predictive policing has been established on a national platform to be ineffective, lead to over-policing marginalized communities, and ignore community and human needs,” the ACLU wrote. “Today, ‘predictive policing’ tools are used primarily to further target communities that are already over-policed.”

The Los Angeles Police Department and the Chicago Police Department both recently stopped using predictive policing methods.

Groups in each city protested their departments’ use of predictive policing on the grounds that it disproportionately targeted Black and Latino communities.

“Predictive policing uses past arrests, past calls for service, all kinds of crime data pulled from records management systems to predict where crime will occur in future, and deploy more officers there,” Coston said. “Obviously what we see with most algorithms is they’re based on prior crime data which we know across the United States is biased in terms of higher predictions of crime occurring in racial minority neighborhoods, lower class neighborhoods, even if that’s not the case.”

According to the NAACP Criminal Justice Fact Sheet, “African Americans and whites use drugs at similar rates, but the imprisonment rate of African Americans for drug charges is almost 6 times that of whites.”

Authors of a study of racial disparities in arrests found that Black participants were arrested seven times more frequently than white participants, and “neither contextual nor behavioral differences account for the arrest disparity.”

Predictive models use arrest histories to determine levels of risk for people. If the criminal justice system disproportionately arrests Black people, predictive models run the risk of disproportionately labeling Black people as more “risky” than white people. 

Additionally, if more arrests take place in one area because of existing racial biases, predictive models could tell police to further police those areas because they are “high crime,” creating a cycle that perpetuates racism. 
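The feedback loop critics describe can be sketched in a few lines of code. The following toy simulation is purely illustrative — all numbers, neighborhood names, and the allocation rule are invented for this example, not drawn from any real department’s system. It models two neighborhoods with identical true crime rates, where one starts with more recorded arrests due to historical bias, and patrols are allocated in proportion to past arrests:

```python
# Toy simulation of the predictive-policing feedback loop described above.
# All figures are invented for illustration; real systems are far more complex.

def simulate(rounds=5):
    # Two neighborhoods with the SAME true underlying crime rate.
    true_crime_rate = {"A": 0.10, "B": 0.10}
    # Neighborhood A starts with more recorded arrests (historical bias).
    recorded_arrests = {"A": 20.0, "B": 10.0}
    patrol_budget = 30  # total patrols to allocate each round

    for _ in range(rounds):
        total = sum(recorded_arrests.values())
        for hood in recorded_arrests:
            # "Predictive" step: patrols go where past arrests were recorded.
            patrols = patrol_budget * recorded_arrests[hood] / total
            # More patrols -> more crime observed -> more arrests recorded,
            # even though the true crime rate is identical in both areas.
            recorded_arrests[hood] += patrols * true_crime_rate[hood]
    return recorded_arrests

result = simulate()
# Neighborhood A keeps accumulating arrests twice as fast as B, so the
# initial disparity is preserved and the absolute gap grows every round,
# despite identical underlying crime.
```

Because patrols are allocated from arrest records rather than from underlying crime, the model never “discovers” that the two neighborhoods are alike — which is the self-reinforcing dynamic Coston and other critics point to.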

“We need to be investing in preventative services, not just sending more police into neighborhoods where people are then criminalized for loitering and vagrancy,” Coston said. “That’s actually the effect of predictive policing…it increases criminalization of low level crimes.”

Coston said there is not a way to do predictive policing without the influences of racial biases.

“Our entire system of policing is so inherently racially biased already. Even if there are individuals making decisions that we can hold accountable, we know there are already racial biases that they have,” Coston said. “Then the anonymity of the computer system adds another layer of protection of police to be able to say, ‘hey, we’re not engaging in bias.'”

Investing in social programs to address the underlying causes of crime is what Coston said is the best alternative to predictive policing.

“The fact of the matter is many of the issues that predictive policing tries to solve are really about things like drugs or gun violence,” Coston said. “We’re not going to address root causes of crime by increasing the number of police. We need to have programs that address education and poverty and drug addiction and mental health.”

Amy Diaz is a journalist hailing from St. Petersburg, FL. She has written for multiple local newspapers in her hometown before becoming a full-time reporter for Flint Beat. When she’s not writing you...