Chapter 3:
Criminal Justice
Predictive Policing: From Neighborhoods to Individuals
In February 2014, the Chicago Police Department (CPD) made national headlines for sending its officers to make personal visits to residents considered most likely to be involved in a violent crime. The selected individuals were not necessarily under investigation, but had histories suggesting they were among the city’s residents most likely to be either a victim or perpetrator of violence. The officers’ visits were guided in part by a computer-generated “Heat List”: the result of an algorithm that attempts to predict involvement in violent crime. City officials have described some of the inputs used in this calculation—it includes some types of arrest records, for example—but there is no public, comprehensive description of the algorithm’s inputs.
The visits were part of a new “Custom Notification Program,” which sends police to people’s homes (or sometimes mails letters) to offer social services and a tailored warning.[52] For example, officers might offer information about a job training program or inform a person that federal law provides heightened sentences for people with certain prior felonies.[53] The city reports that the contents of a notification letter are based on an analysis of “prior arrests, impact of known associates, and potential sentencing outcomes for future criminal acts.”[54] Although some of these visits have been poorly received,[55] the department argues that the outreach efforts may already have deterred crime.[56] Mayor Emanuel recently claimed that, of the 60 interventions that have already taken place, “none of the notified individuals have been involved in any new felony arrests.”[57]
The Heat List is a rank-order list of people judged most likely to be involved in a violent crime, and is among the factors used to single people out for these new notifications. The CPD reports that the Heat List is “based on empirical data compared with known associates of the identified person.”[58] However, little is known about what factors put people on the Heat List, and a FOIA request to see the names on the list was denied on the grounds that the information could “endanger the life or physical safety of law enforcement personnel or [some] other person.”[59] Media outlets have reported that various types of data are used to generate the list, including arrests, warrants, parole status, weapons and drug-related charges, acquaintances’ records, having been a victim of a shooting or having known a victim,[60] prison records, open court cases, and victims’ social networks.[61] The program’s designer, Illinois Institute of Technology (IIT) Professor Miles Wernick, has denied that the “algorithm uses ‘any racial, neighborhood, or other such information’ in compiling the list.”[62]
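Because the algorithm has not been disclosed, any reconstruction of it is necessarily speculative. Still, the media accounts above suggest the general shape of such a system: compute a score for each person from criminal-history and social-network features, then rank. The sketch below illustrates only that pattern; every feature name and weight is invented for the example and should not be read as the CPD’s actual model.

```python
# A purely illustrative sketch of a rank-ordered "risk list."
# The CPD's actual model is NOT public: the features and weights
# below are hypothetical, chosen only to show the general shape of
# such a system (score each person from record data, then sort).

from dataclasses import dataclass

@dataclass
class PersonRecord:
    name: str
    prior_arrests: int       # media reports list arrests as one input
    open_warrants: int
    on_parole: bool
    weapons_charges: int
    shooting_victim: bool    # having been a victim of a shooting
    flagged_associates: int  # acquaintances with criminal records

# Hypothetical linear weights -- not the CPD's actual parameters.
WEIGHTS = {
    "prior_arrests": 1.0,
    "open_warrants": 2.0,
    "on_parole": 1.5,
    "weapons_charges": 3.0,
    "shooting_victim": 2.5,
    "flagged_associates": 0.5,
}

def risk_score(person: PersonRecord) -> float:
    """Weighted sum of record features (bools count as 0 or 1)."""
    return sum(weight * getattr(person, feature)
               for feature, weight in WEIGHTS.items())

def heat_list(population: list[PersonRecord], top_n: int = 10) -> list[PersonRecord]:
    """Rank-order everyone by score and keep the highest-scoring entries."""
    return sorted(population, key=risk_score, reverse=True)[:top_n]
```

Under this kind of scheme, small design choices, such as how heavily to weight acquaintances’ records, determine who lands on the list, which is one reason observers have pressed for the real inputs to be made public.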
Cities across the country are expanding their use of data in law enforcement. The most common applications of predictive technology are to assist in parole board decisions[63] and to create heat maps of the most likely locations of future criminal activity, in order to distribute police manpower more effectively. Such systems have proven highly effective in reducing crime, but they may also create an echo chamber effect: crimes in heavily policed areas are more likely to be detected than the same offenses committed elsewhere. This effect can produce statistics that overstate the concentration of crime, which in turn biases future allocations of resources.
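A toy simulation can make that feedback mechanism concrete. In the sketch below, all numbers are invented: two districts have identical true crime rates, but one starts with more patrols, and each period’s detected crime (not true crime) drives the next period’s patrol allocation.

```python
import random

# Toy model of the echo chamber effect (all parameters invented).
# Two districts commit the SAME number of true offenses, but patrol
# levels determine how many offenses are actually detected, and
# detected counts drive the next period's patrol allocation.

TRUE_OFFENSES = 100          # actual offenses per district per period
DETECTION_PER_PATROL = 0.05  # detection probability added by each patrol unit
TOTAL_PATROLS = 20

random.seed(1)
patrols = [12, 8]            # initial, unequal allocation

for period in range(5):
    detected = []
    for d in range(2):
        p_detect = min(1.0, patrols[d] * DETECTION_PER_PATROL)
        detected.append(sum(random.random() < p_detect
                            for _ in range(TRUE_OFFENSES)))
    # Reallocate patrols in proportion to detected (not true) crime.
    total = sum(detected)
    patrols = [round(TOTAL_PATROLS * d / total) for d in detected]
    print(f"period {period}: detected={detected}, next patrols={patrols}")
```

Because detection scales with patrol presence, the recorded statistics mirror the initial allocation rather than the underlying crime rate, and the disparity never corrects itself.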
Chicago’s experiment is one of several of a new type, in which police departments move beyond traditional geographic “crime mapping” to instead map the relationships among city residents. Identifying individuals for tailored intervention is the trend most likely to expand as predictive policing matures, and it raises important questions about how justice can be protected when such judgments are mediated by machine systems. Other jurisdictions are already working with academics to develop similarly styled programs, including one in Maryland that aims to “predict which of the families known to social services are likely to inflict the worst abuses on their children.”[64] In projects like these, automated predictions of future bad behavior may arise, and may be acted upon, even without direct evidence of wrongdoing. Such systems will sometimes make inaccurate predictions, and when they do, their mistakes may create unjustified guilt by association, which has historically been anathema to our justice system.
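The arithmetic of rare events shows why those mistakes loom so large. The back-of-the-envelope calculation below uses invented numbers, not figures from any real program, but it illustrates a general point: when the predicted behavior is rare, even a fairly accurate model flags mostly people who would never have offended.

```python
# Bayes' rule with hypothetical numbers (not from any real system).
base_rate = 0.01             # assume 1% of the screened population would offend
sensitivity = 0.90           # assume the model catches 90% of true future offenders
false_positive_rate = 0.10   # and wrongly flags 10% of everyone else

flagged_true = sensitivity * base_rate                 # correctly flagged
flagged_false = false_positive_rate * (1 - base_rate)  # wrongly flagged
precision = flagged_true / (flagged_true + flagged_false)

print(f"Share of flagged people who would actually offend: {precision:.1%}")
# -> about 8.3%: under these assumptions, more than nine in ten
#    flagged individuals would never have offended.
```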
Even as they expand their efforts to collect data, city governments often lack the in-house capacity to analyze the vast amounts of data they are aggregating, and many are partnering with private or academic institutions to assist in the process. In Chicago, the city is working with the MacArthur-backed Crime Lab to analyze the effectiveness of various programs, including “Becoming A Man,” a program that focuses on violence prevention among at-risk youth.[65] These partnerships allow the city to expand the ways it uses the data it collects, and may unlock significant benefits (by, for example, demonstrating the effectiveness of non-punitive crime reduction programs). At the same time, the private actors conducting these and other analyses should be held to at least the same standards of accountability and transparency that would apply if the city were analyzing its data internally.
[52] Chicago Police Department, D13-09, Custom Notifications In Chicago – Pilot Program (2013), http://directives.chicagopolice.org/directives-mobile/data/a7a57bf0-13fa59ed-26113-fa63-2e1d9a10bb60b9ae.html (“The letter will be specific to the identified individual and incorporate those factors known about the individual inclusive of prior arrests, impact of known associates, and potential sentencing outcomes for future criminal acts.”).
[53] Thomas Frisbie, Chicago Police ‘custom notifications’: Is it profiling?, Chicago Sun-Times (Feb. 26, 2014), http://voices.suntimes.com/early-and-often/backtalk/chicago-police-custom-notifications-is-it-profiling (“[F]ederal law says that if you have certain felonies in your criminal history and you get caught with a gun you can be prosecuted as a career armed criminal and [can] face a minimum of 15 years in prison[.]”).
[54] Chicago Police Department, supra note 52 (“The Custom Notification is predicated upon national research that concluded certain actions and associations within an individual’s environment are a precursor to certain outcomes should the individual decide to or continue to engage in criminal behavior.”).
[55] Jeremy Gorner, Chicago Police Use ‘Heat List’ as Strategy to Prevent Violence, Chicago Tribune (Aug. 21, 2013), http://articles.chicagotribune.com/2013-08-21/news/ct-met-heat-list-20130821_1_chicago-police-commander-andrew-papachristos-heat-list (“‘I haven’t done nothing that the next kid growing up hadn’t done. Smoke weed. Shoot dice. Like seriously?’ an incredulous McDaniel said while recalling the recent visit from police brass with a Tribune reporter.”).
[56] Id.
[57] Robin Kelly, Kelly Report 2014: Gun Violence in America 15 (2014), http://robinkelly.house.gov/sites/robinkelly.house.gov/files/wysiwyg_uploaded/KellyReport_1.pdf.
[58] Chicago Police Department, supra note 52.
[59] Letter from P.O. Cronin, Assistant Freedom of Information Officer, Chicago Police Dep’t, to Matthew Stroud, Reporter, The Verge (Jan. 6, 2014), http://cdn2.sbnation.com/assets/4020793/Stroud-CPD-FOIA.jpg (“Endanger the life or safety of law enforcement personnel or any other person.”).
[60] Kristal Hawkins, ‘Heat list’ brings Minority Report-style police attention for likely offenders, Chicago Crime Library (Feb. 24, 2014), http://www.crimelibrary.com/blog/2014/02/24/heat-list-brings-minority-report-style-police-attention-for-likely-offenders-in-chicago/index.html.
[61] Mark Guarino, Can Math Stop Murder?, Christian Science Monitor (Jul. 20, 2014), http://www.csmonitor.com/USA/2014/0720/Can-math-stop-murder-video.
[62] Matt Stroud, The Minority Report: Chicago’s New Police Computer Predicts Crimes, But is it Racist?, The Verge (Feb. 19, 2014), http://www.theverge.com/2014/2/19/5419854/the-minority-report-this-computer-predicts-crime-but-is-it-racist.
[63] Prison Breakthrough, The Economist (Apr. 19, 2014), http://www.economist.com/news/united-states/21601009-big-data-can-help-states-decide-whom-release-prison-prison-breakthrough (“Four-fifths of parole boards now use ‘risk-assessment’ technology, says Joan Petersilia of Stanford University.”).
[64] Don’t Even Think About It, The Economist (Jul. 18, 2013), http://www.economist.com/news/briefing/21582042-it-getting-easier-foresee-wrongdoing-and-spot-likely-wrongdoers-dont-even-think-about-it.
[65] William Harms, City Cites Crime Lab Data in Funding Innovative Youth Program, UChicago News (Feb. 7, 2013), http://news.uchicago.edu/article/2013/02/07/city-cites-crime-lab-data-funding-innovative-youth-program.