Beyond the hype of AI: predictive policing in practice

Reading Time: 2 minutes

With the advance of big data, terms such as artificial intelligence (AI), deep learning and machine learning have been thrown about for the past five to ten years. To anyone familiar with the topic purely through scanning the popular science section of their local newsstand, it might seem as though this technology poses the single biggest threat to job security and society as we know it: robots will take over the world and we will have to adapt in order to avoid becoming redundant in the near future. While this makes very good material for airplane literature, little has been said about the actual use of AI and machine learning in organisations.

Marleen Huysman, professor at the School of Business and Economics and head of the KIN Center for Digital Innovation at VU University, wants to go beyond the popular narrative of AI as a threat to job security and address what has so far been addressed by neither academic nor popular scientific literature: the possibilities AI offers in the workplace. The AI@WorkLab at KIN studies how AI is actually being designed and developed, and how it is used as a tool in professional environments.

One example of AI in the workplace is predictive policing. Dick Willems, a data scientist at the Amsterdam police department, designed the Criminaliteits Anticipatie Systeem (CAS), an algorithm based on logistic regression that predicts where crime is likely to occur. Previously, these predictions were made by analysts, who would create a map of potential crime hotspots based on the information available to them. Those predictions were not based solely on data, but on a blend of the analyst's experience, knowledge, possible bias and a hint of guesswork.

The CAS algorithm, on the other hand, works purely with facts from victim reports (such as time, place and type of crime) and additional statistical information that is publicly available about different neighbourhoods. After being fed heaps of data, it spits out a prediction for the coming week. It is important to note that this technological enhancement does not stand on its own: it is useless without police and intelligence officers interpreting the data and embedding it in the necessary context.
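The inner workings of CAS are not public, but the general idea described above can be sketched in a few lines of code. The sketch below is a toy illustration, not the actual system: the features, weights and data are all hypothetical stand-ins. It trains a logistic regression on synthetic per-grid-cell data and ranks cells by predicted risk, the way a hotspot map would.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: one row per (grid cell, week).
# Features: incidents in the cell last week, incidents in neighbouring
# cells, and a static neighbourhood statistic.
n = 1000
X = rng.poisson(lam=[2.0, 5.0, 3.0], size=(n, 3)).astype(float)

# Invented weights, used only to generate plausible labels
# ("did a crime occur in this cell the following week?").
logits = 0.8 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] - 4.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

def fit_logistic(X, y, lr=0.05, steps=2000):
    """Fit logistic regression by plain gradient descent."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Xb @ w))          # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)      # gradient of log-loss
    return w

def predict_risk(X, w):
    """Probability of crime per grid cell for the coming week."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1 / (1 + np.exp(-Xb @ w))

w = fit_logistic(X, y)
risk = predict_risk(X, w)

# The ten highest-risk cells would be the "hotspots" handed to
# intelligence officers for interpretation.
hotspots = np.argsort(risk)[::-1][:10]
```

In practice the model's output is only a ranked map; as the article notes, turning that map into patrol decisions still requires human context.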

Lauren Waardenburg (PhD candidate at VU University) spent three years in the police force researching the effects of predictive policing. Her study revealed that rather than deskilling employees, this technology is reskilling them. For example, police officers are now responsible for entering the data the algorithm needs to make predictions, resulting in an increase in administrative duties. Also interesting is the change in dynamics between police officers and intelligence officers. Now that the CAS algorithm supplies the required data, the tasks of intelligence officers have become more qualitative in nature: putting the data in context for officers and correcting the results when necessary.

All in all, there are of course many fields in which machine learning might be deployed, and the effects may differ vastly between them. Nonetheless, this particular case shows how AI technologies can be used to our advantage, and that they can offer real benefits for both employees and employers, though often in unexpected ways.

Author: Sifra Wee