Next Up, Precrime

The law enforcement community seems to like its tools creepier and creepier. We have everything from checkpoints where innocent civilians are accused of crimes without cause (usually driving under the influence) to closed-circuit cameras lining street corners in major metropolitan areas. Now the police of Santa Cruz, California are testing software that predicts where crimes will occur:

The arrests were routine. Two women were taken into custody after they were discovered peering into cars in a downtown parking garage in Santa Cruz, Calif. One woman was found to have outstanding warrants; the other was carrying illegal drugs.

But the presence of the police officers in the garage that Friday afternoon in July was anything but ordinary: They were directed to the parking structure by a computer program that had predicted that car burglaries were especially likely there that day.

The program is part of an unusual experiment by the Santa Cruz Police Department in predictive policing — deploying officers in places where crimes are likely to occur in the future.

There are two things I’ll note about this. First, relying on a computer program to decide where officers are deployed for an entire day seems like a bad idea: a criminal who figured out the algorithm would also know where the police weren’t likely to be. Second, how long will it take until the output of this software becomes admissible in court as evidence?

The story states that the police didn’t actually catch the two suspects breaking into a car; they caught them looking inside cars. Granted, nine times out of ten that means those people are planning to break into a vehicle, but until they actually perform the action no crime has been committed. Wouldn’t it have been better to wait for the two suspects to actually commit a crime before arresting them? I say this both as a libertarian who’s disgusted that somebody can be arrested without actually breaking the law and as an engineer who works to ensure his software is properly tested.

How can the police know whether the software works if they don’t wait for the suspects to actually break into a car? All they know now is that the program predicted people would arrive in the parking garage and look inside vehicles. That right there is a poorly executed test case, and if I were one of the developers I’d be rather pissed at the officers’ execution of the test.

The story does mention that one suspect had an outstanding warrant and the other was carrying drugs (which isn’t a crime in my book). That’s all well and good, but the fact of the matter is that both of these things only came to light after the police arrested the women for not actually doing anything. Due to that simple fact, I would say everything else the police learned is irrelevant.

Furthermore, I’d say the software isn’t so much intelligent as simply programmed with a great deal of common sense:

On the day the women were arrested, for example, the program identified the approximately one-square-block area where the parking garage is situated as one of the highest-risk locations for car burglaries.

Wait… a structure which houses potentially hundreds of cars that remain mostly unprotected throughout the day is a likely spot for car burglaries? Well Hell’s bells everybody, this software can figure out what any person with common sense could have told you without thousands of man-hours of development time. I’m sure if you park a few police officers in the parking structure unannounced every day of the week you’re going to encounter quite a few people planning on breaking into other peoples’ cars (until the criminals figure out that the police are hanging around there every day, at which point those thugs will find a different parking garage).
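To make the point concrete, here’s a toy sketch of what a “common sense” hotspot predictor might amount to. The incident log and block names below are invented for illustration, and this is obviously not the actual software in the story, just the trivial baseline I’m suggesting it approximates: count where past car burglaries happened and flag the block with the most.

```python
from collections import Counter

# Hypothetical incident log: each entry is the city block where a
# past car burglary was reported. All names are made up.
past_burglaries = [
    "downtown-garage", "downtown-garage", "elm-street",
    "downtown-garage", "oak-avenue", "downtown-garage",
    "elm-street", "downtown-garage",
]

def predict_hot_spot(incidents):
    """Return the block with the most recorded incidents."""
    counts = Counter(incidents)
    block, _ = counts.most_common(1)[0]
    return block

print(predict_hot_spot(past_burglaries))  # prints "downtown-garage"
```

Unsurprisingly, a dense parking garage dominates any such tally, which is exactly what anyone with common sense would have told you for free.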