(CNSNews.com) -- A coalition of 17 civil rights and technology groups is claiming that “predictive policing” tools currently in use in at least 20 large police departments across the U.S. rely on biased data that “supercharges discrimination” in minority communities while not making them any safer.
Funded with state and federal grants, computerized predictive policing programs like PredPol and HunchLab, often referred to as "cops on dots," use historical crime data to identify "hot spots" and forecast where criminal activity is most likely to occur in the future so that police departments can proactively intervene.
CompStat, introduced in 1994 by New York Police Commissioner Bill Bratton, uses predictive policing techniques to pinpoint potential criminal activity. In April, Bratton credited CompStat for a 75 percent reduction in crime in the city.
A 2013 National Institute of Justice (NIJ) report explains that “at the core of this process is a four-step cycle. The first two steps involve collecting and analyzing crime, incident, and offender data in order to develop predictions. The third step consists of conducting police operations that intervene to prevent predicted threats to public safety…The fourth step involves the implementation of interventions.”
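At its simplest, the first two steps of that cycle amount to aggregating historical incident reports on a map grid and ranking the cells with the most activity. The following sketch illustrates the idea only; the grid size, function name, and sample coordinates are hypothetical and do not reflect any vendor's actual method:

```python
from collections import Counter

def hot_spots(incidents, cell_size=0.01, top_n=3):
    """Rank map grid cells by historical incident count.

    incidents: list of (latitude, longitude) pairs from past reports.
    cell_size: width/height of each square grid cell, in degrees.
    Returns the top_n (cell, count) pairs with the most incidents.
    """
    counts = Counter(
        (int(lat // cell_size), int(lon // cell_size))
        for lat, lon in incidents
    )
    return counts.most_common(top_n)

# Toy data: three reports cluster in one grid cell, one falls elsewhere.
reports = [(45.5120, -122.6580), (45.5125, -122.6582),
           (45.5121, -122.6579), (45.5300, -122.6900)]
print(hot_spots(reports, top_n=2))
```

A real system would then drive steps three and four, directing patrols to the top-ranked cells and logging the resulting interventions back into the data.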
NIJ is sponsoring a Real-Time Crime Forecasting Challenge, open to students and businesses, to predict four types of crime (burglary, street crime, auto theft and all calls for service) that will occur in Portland, Oregon between March 1, 2017 and May 31, 2017. A total of $1.2 million in prizes will be awarded to contestants who most accurately predict the type and number of actual crimes that are committed during that period.
But these “computer-driven hunches… are used primarily to further concentrate enforcement activities in communities that are already over-policed, rather than to meet human needs,” the coalition stated in an Aug. 31 joint statement.
“Most predictive policing systems fielded today focus narrowly on the reported crime rate. Other vital goals of policing, such as building community trust, eliminating the use of excessive force, and reducing other coercive tactics, are currently not measured and not accounted for by these systems,” the statement continued.
“As a result, current systems are blind to their impact in these areas, and may do unnoticed harm.”
“This is fortune-teller policing that uses deeply flawed and biased data and relies on vendors that shroud their products in secrecy. Instead of correcting dysfunctional law enforcement practices, these products exacerbate the most profound flaws in our criminal justice system,” Wade Henderson, president and CEO of The Leadership Conference on Civil and Human Rights, told reporters during a conference call last week.
“They super-charge discrimination, profiling, and the over-policing of certain communities, particularly communities of color. These technologies threaten the Constitution’s promises of equal protection under the law and due process, and its protections against unreasonable searches and seizures.
“To me, it evokes the 2002 Tom Cruise sci-fi thriller, Minority Report, where police would enforce what was called ‘pre-crime’ and punish people for crimes they never committed,” Henderson said.
“The problem is, those forecasts are only as good as the data they’re based on,” said Upturn’s David Robinson.
“Imagine if in your neighborhood, police frequently stopped people looking for reasons to make arrests, and wrote up whatever infractions they could find: open containers, smoking, a lane change without a turn signal. Meanwhile, across town, those same things don’t get written up.
“That’s part of the crime data we have today. And when you turn a computer loose on that data, you’ll hear talk about algorithms and so on, but the bottom line is that computers find patterns, and when they look in data that reflects bias, the patterns that they find and the decisions that they drive can turn out to be biased in turn. That was our core finding.”
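Robinson's point about feedback can be made concrete with a toy simulation. This is a deliberately simplified, hypothetical model, not any vendor's algorithm: two neighborhoods have the same underlying offense rate, but one starts with more recorded incidents, and a predictor that simply chases past report counts keeps sending patrols there:

```python
import random

random.seed(0)

# Two neighborhoods with the SAME true offense rate, but A starts
# with more recorded incidents because it was patrolled more heavily.
TRUE_RATE = 0.5               # chance a patrol visit yields a report
recorded = {"A": 20, "B": 5}  # historical (biased) report counts

for _ in range(100):
    # "Predictive" step: send the patrol where past data says crime is.
    target = max(recorded, key=recorded.get)
    # Observation step: a report can only be logged where police look.
    if random.random() < TRUE_RATE:
        recorded[target] += 1

# Every new report lands in A; B's count never changes, so the data
# gap widens even though the two areas are identical by construction.
print(recorded)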
Robinson added that there is little transparency, accountability or public discussion about these systems, which are currently being considered by 150 or more police departments across the country.
“But we found little evidence that today’s systems live up to their claims, and significant reason to fear that they may reinforce disproportionate and discriminatory policing practices,” the study said.
However, defenders of the $23 million industry say that predictive policing benefits minority communities by stopping crimes before they happen.
"This is not Minority Report," said UCLA anthropology Professor P. Jeffrey Brantingham, who helped develop and run PredPol. “Minority Report is about predicting who will commit a crime before they commit it. This is about predicting where and when crime is most likely to occur, not who will commit it.”
A study published last year in the Journal of the American Statistical Association, co-authored by G.O. Mohler, assistant professor of mathematics and computer science at Santa Clara University, Los Angeles Police Commander and Chief of Staff Sean Malinowski, and others, stated that predictive policing “has proven effective in reducing crime” in high-crime hot spots during randomized controlled field trials.
“Police patrols using ETAS [epidemic-type aftershock sequence] forecasts led to an average 7.4% reduction in crime volume as a function of patrol time, whereas patrols based upon analyst predictions showed no significant effect,” the study found.
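ETAS models, borrowed from earthquake seismology, treat each crime as raising the short-term risk of further crimes nearby, with that extra risk decaying over time. A minimal one-dimensional sketch of such a self-exciting intensity follows; the parameter values are illustrative and this is not the study's fitted model:

```python
import math

def etas_intensity(t, past_events, mu=0.2, theta=0.5, omega=1.0):
    """Conditional intensity of a simple self-exciting point process:
    a constant background rate mu plus an exponentially decaying
    'aftershock' contribution from each earlier event time ti < t."""
    return mu + sum(
        theta * omega * math.exp(-omega * (t - ti))
        for ti in past_events if ti < t
    )

burst = [9.0, 9.5, 9.8]
# Risk just after a burst of events is well above the background rate...
print(round(etas_intensity(10.0, burst), 3))
# ...and decays back toward mu as time passes with no new events.
print(round(etas_intensity(20.0, burst), 3))
```

Forecasts like the ones in the study direct patrols to the places and times where this estimated intensity is currently highest.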
“We are focused on preventing crime,” PredPol CEO Larry Samuels said in a 2015 article in Police Magazine. “All we use in our analysis is the what, the where, and the when. We are constitutionally friendly and very, very accurate.”
But coalition member Ezekiel Edwards, director of the ACLU’s Criminal Law Reform Project, maintains that the use of historical crime data is the problem, not the solution.
“The ACLU’s chief concern with predictive policing is, simply put, ‘garbage in, garbage out.’ It is well known that crime data is notoriously suspect, incomplete, easily manipulated, and plagued by racial bias,” Edwards told reporters.
“Data on where crime occurs is dependent in part on when and where crime is reported, and in part on where the police deploy to find crime. Lots of crime goes unreported, and the police are selective in where they deploy to look for it.
“In addition, despite an excitement by law enforcement over using big data, ironically many police departments do a poor job of collecting data in a comprehensive, uniform, transparent and accessible manner,” he said.