Monday, September 6, 2010

Predictive Policing


I read a book a couple of years ago – Super Crunchers: Why Thinking-by-Numbers Is the New Way to Be Smart by Yale Law School professor and economist Ian Ayres – which explains, with examples, how sophisticated “regression analysis” (using masses of past data to predict future trends and behavior) has become so good that determinations most of us believe are best made by seasoned professionals – like medical diagnoses – are often vastly more accurate when made by big fat computers programmed with massive amounts of data. So much of human behavior, from what people like or don’t like, to how they respond to weather patterns (according to Wal-Mart, for example, Floridians react to approaching hurricanes by buying a disproportionate number of strawberry Pop-Tarts!), to highway usage, can be reasonably and accurately predicted from this manipulation of data.
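To make the idea concrete, here is a minimal sketch of the kind of regression Ayres describes – fit a model to masses of past data, then use it to predict future behavior. Everything here (the numbers, the single-variable setup) is invented for illustration; it is not real Wal-Mart data:

```python
# Toy regression-based prediction (invented numbers, not real retail data):
# fit past weekly sales against a hurricane-warning indicator, then predict
# demand for a week when a warning is in effect.
import numpy as np

# Historical weekly Pop-Tart sales (units) and whether a hurricane
# warning was active that week (1 = yes, 0 = no).
warning = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1, 0], dtype=float)
sales = np.array([210, 195, 640, 205, 702, 661, 188, 215, 689, 199], dtype=float)

# Ordinary least squares: sales ~ b0 + b1 * warning.
X = np.column_stack([np.ones_like(warning), warning])
(b0, b1), *_ = np.linalg.lstsq(X, sales, rcond=None)

print(f"baseline weekly sales: {b0:.0f}")
print(f"extra sales when a warning is active: {b1:.0f}")
print(f"predicted sales next week (warning active): {b0 + b1:.0f}")
```

Real retail models juggle thousands of variables instead of one, but the mechanics – estimate coefficients from history, apply them to tomorrow – are the same.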

Right-brained creatives and most folks who believe in free will have philosophical issues with this level of computer predictability; it just feels wrong, even though the statistical evidence proves them incorrect. We’re individuals, they cry, not numbers! They don’t like that a statistical analysis of their buying habits can predict such personal events as divorce or the collapse of their lifestyle, but credit-rating agencies are slurping this data up. The predictive validity is scary. And the feeling of having your most personal information – your privacy – invaded by numbers is disconcerting.

So what happens when police start entering decades of crime data – tied to weather, time of year, economic trends, political waves and other seemingly unconnected variables – and start looking for a computer program that can tell them when and where criminals are likely to strike next? Enter Tom Cruise in Minority Report? Science fiction has often been a predictor of science fact. With dwindling governmental budgets, deploying fewer street cops but targeting their presence where they will do the most good becomes less a wish than a necessity.

Enter UCLA and the Los Angeles Police Department: “Predictive policing is rooted in the notion that it is possible, through sophisticated computer analysis of information about previous crimes, to predict where and when crimes will occur. At universities and technology companies in the U.S. and abroad, scientists are working to develop computer programs that, in the most optimistic scenarios, could enable police to anticipate, and possibly prevent, many types of crime… Some of the most ambitious work is being done at UCLA, where researchers are studying the ways criminals behave in urban settings…

“The LAPD has positioned itself aggressively at the center of the predictive policing universe, forging ties with the UCLA team and drawing up plans for a large-scale experiment to test whether predictive policing tools actually work. The department is considered a front-runner to beat out other big-city agencies in the fall for a $3-million U.S. Justice Department grant to conduct the multiyear tests… LAPD officials have begun to imagine what a department built around predictive tools would look like… Automated, detailed crime forecasts tailored to each of the department's 21 area stations would be streamed several times a day to commanders, who would use them to make decisions about where to deploy officers in the field.” Los Angeles Times (August 21st). Even before the economic collapse, Los Angeles – a vast geographical area (469 square miles) with a large, spread-out population – had one of the worst police-officers-per-capita ratios in the country. The LAPD’s response time to crime is legendarily slow; the department is always playing catch-up, with notoriously underwhelming results.

UCLA mathematician George Mohler wrote an unpublished paper about forecasting criminal activity. The LA Times explains: “When a home is burglarized, for example, the same house and others in its immediate surroundings are at much greater risk of being victimized in the days that follow. The phenomenon is called an exact or near-repeat effect… The same dynamic can explain the way rival gangs retaliate against one another. And, although it is harder to pin down in more complex crimes that are motivated by passion or other emotions, experts believe it holds true there as well.

“Mohler wasn't all that interested in what it is about criminals that makes this so. He focused instead on adapting the math formulas and computer programs that seismologists use to calculate the probability of aftershocks, fitting them to crime patterns. (Aftershocks can occur hundreds of miles from an epicenter and many months after an earthquake, while the elevated risk of burglaries and other crimes tends to subside over a matter of weeks and several city blocks.)… Using LAPD data, Mohler tested his computer model on several thousand burglaries that occurred in a large section of the San Fernando Valley throughout 2004 and 2005. The results, he said, were far more effective than anything on the market today.”
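The article doesn’t spell out Mohler’s math, but the aftershock analogy maps onto what statisticians call a self-exciting point process: each crime temporarily raises the expected rate of further crimes nearby, with the boost decaying over time. Here is a minimal one-dimensional sketch (time only, all parameter values invented for illustration; Mohler’s actual model also handles spatial distance):

```python
# Toy self-exciting ("aftershock-style") model of the near-repeat effect.
# All parameters are invented for illustration, not taken from Mohler's paper.
import math

MU = 0.05     # background burglary rate for a block (events per day)
ALPHA = 0.4   # expected number of follow-on crimes triggered per crime
OMEGA = 0.2   # decay rate: the risk boost fades over a few weeks (1/days)

def intensity(t, past_events):
    """Instantaneous crime rate at day t, given the days of past burglaries."""
    boost = sum(ALPHA * OMEGA * math.exp(-OMEGA * (t - ti))
                for ti in past_events if ti < t)
    return MU + boost

# A burglary on day 0: risk is sharply elevated right after, then subsides.
events = [0.0]
for day in [1, 7, 14, 30, 60]:
    print(f"day {day:2d}: rate = {intensity(day, events):.3f} (baseline {MU})")
```

Each additional nearby burglary stacks another decaying boost on top – the “near-repeat effect” the Times describes – and forecasting then amounts to flagging the blocks where this rate is currently highest, which is exactly where those streamed LAPD deployment maps would send officers.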

Traffic and street cameras – especially those that result in automated traffic tickets being issued – seem Orwellian and intrusive, but doesn’t statistical regression analysis become even more invasive? What’s your preference? Less intrusion and higher crime rates, or…

I’m Peter Dekom, and I find all this stuff really fascinating.
