Unopened Space

posted: 09/07/19



This project focused on the use of algorithms in relation to the police and public security. I specifically focused on the predictive policing system used in The Netherlands, known as CAS (Criminaliteits Anticipatie Systeem). This system, which was built by the police, uses recorded criminal activity and locational data to predict when and where a crime is likely to occur. The predictions manifest themselves as hotspots of 125 m by 125 m. Once one of these squares is flagged as a hotspot, the police pay closer attention to it.


Whilst in theory the system should work, the outcome is far from perfect. The data used by the police contains all the biases that the police are prone to, including discrimination against minorities and working-class communities. The predictive system only amplifies these biases by solidifying them in a form that seems to represent a totality. This vision of the future is used and trusted by the police, yet it can lead to increased policing in already over-policed areas, and to feedback loops: crime is found in hotspot areas partly because that is where officers are looking, while crime elsewhere goes unrecorded and the hotspots reinforce themselves.
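The feedback loop can be illustrated with a deliberately minimal sketch. This is not the CAS system or its data, just a hypothetical two-area simulation: both areas have the same underlying crime rate, but one starts with more recorded incidents, so it attracts every patrol and accumulates nearly all new records.

```python
import random

random.seed(42)

TRUE_CRIME_RATE = 0.3          # identical underlying rate in both areas
recorded = [20, 10]            # area 0 starts with a historical recording bias

for step in range(200):
    # "Predict" by sending the patrol to the area with the most records so far.
    patrolled = 0 if recorded[0] >= recorded[1] else 1
    # Crime can only be recorded where the patrol is actually looking.
    if random.random() < TRUE_CRIME_RATE:
        recorded[patrolled] += 1

print(recorded)  # area 0 keeps growing; area 1 never gains a single record
```

Even though both areas are equally crime-prone by construction, the system never revises its picture of area 1, because no one is sent there to observe it.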


This project manifested itself in three pieces: a video, a symposium organised by all NLN students, and a written essay published in a book alongside other students' writing.


The symposium, Violent Patterns, brought together five speakers from a variety of backgrounds to speak on the topic of public security under algorithmic control.


The essay is an accompaniment to the video, which can be seen below. In the video I used data collected by the police for their predictive system to create a model of The Hague covered in a preemptive coating. Using a neural network trained on images of a supposedly high-crime landscape, the machine generates a predictive, dystopian vision of the city that illustrates the flaws of the predictive system. The video shows both the process and the result of the project.