Here’s an interesting article on predictive policing from Motherboard. People are concerned that if a particular area has been overpoliced in the past, that is where the algorithms will predict crime in the future, and those areas will continue to be overpoliced. Others simply don’t like the idea of proprietary algorithms. I think any of these concerns could play out badly depending on how the tool is implemented, but I don’t see why the tool itself could not be implemented in a fair way. In fact, I don’t see why measures to prevent discrimination couldn’t be built into the algorithms themselves. If the algorithms show that people in a particular area or demographic group are being arrested at higher rates, that could aid the search for root causes and for preventive measures to help that group revert to the mean.

Transparency seems good in principle, maybe publishing some generalized statistics and maps, but of course if it is too predictable exactly where the police are going to be and when, people could take advantage of that. You could try to get around this by balancing random and targeted patterns within the algorithm.
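That last idea, balancing random and targeted patrols, could be sketched as a simple epsilon-greedy rule: most of the time patrol the highest-risk area, but some fraction of the time patrol a random one so the schedule stays unpredictable. This is just my own illustration of the concept; the function, area names, and scores below are made up, not anything from the article.

```python
import random

def choose_patrol(scores, epsilon=0.2, rng=random):
    """Pick an area to patrol.

    With probability `epsilon`, choose an area uniformly at random
    (unpredictability); otherwise choose the area with the highest
    predicted-risk score (targeting). `scores` maps area -> risk score.
    """
    if rng.random() < epsilon:
        return rng.choice(list(scores))   # random patrol
    return max(scores, key=scores.get)    # targeted patrol

# Hypothetical risk scores for three areas. Over many draws, the
# top-scoring area dominates, but every area still gets some coverage.
scores = {"A": 0.9, "B": 0.5, "C": 0.2}
picks = [choose_patrol(scores, epsilon=0.3) for _ in range(1000)]
```

Tuning `epsilon` is then a policy choice: higher values mean less predictable but less targeted patrols.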
predictive policing