Dutch and British experts discuss opportunities and risks of 'Predictive Policing'

Police are trawling through large amounts of data and searching for patterns to assess possible crime risks. But what are the risks of 'Predictive Policing' itself? That, and more, was discussed by a panel of Dutch and British academics, a police officer and the director of a data mining and analytics firm at a public diplomacy event at The Royal Society on 4 December, organised by the Dutch Embassy in the UK and the Department of Digital Humanities at King's College London.

Predictive Policing Public Diplomacy event in London

Data analyses and risk assessments

Big data is increasingly being used in policing and by social services to make predictions about future behaviour. That is why it is important to publicly discuss this development and to exchange knowledge about it. The practice of predictive policing consists of automated searches for patterns in large amounts of data, the results of which are fed to decision makers. Examples include risk assessments of prisoners' behaviour and targeted patrolling.

A panel of Dutch and British data specialists

Practitioners and leading academics in the fields of security practices, intelligence, and criminology from the Netherlands and the United Kingdom were invited to discuss cases of predictive analysis in the context of its societal implications and ethical dilemmas, from privacy to biased algorithms. The panel consisted of data specialist Reinder Doeleman of Police Amsterdam, CEO of Xantura Wajid Shafiq, Professor Claudia Aradau of King's College London, Professor Bob Hoogenboom of Nyenrode Business University, and Professor Marc Schuilenburg of VU University Amsterdam.

Transparency about data and algorithms imperative

Can predictive policing make our societies safer, as some say, or is it an unsafe practice itself, as others fear? When asked whether predictive policing may result in algorithmic bias and the profiling of individuals and neighbourhoods, panellist Hoogenboom stated that "data is not knowledge. It needs to be interpreted." Panellist Doeleman argued that 'predictive policing' may be an inaccurate term, as the police are not predicting crime but computing probabilities. Panellist Aradau pointed out that data are deliberately grouped to find patterns, which means there is inherent "discrimination" in the data. She stressed that it is therefore imperative to be transparent about the algorithms used. Panellist Shafiq argued that transparency about the data and algorithms can actually make policing less biased than it is now: "When you see that the system is skewed towards ethnicity, you can alter the algorithm." The topic of "turning things around by making decision making more transparent" got people talking. One of the overall conclusions was that big data could be useful for preventing crime, as well as for preventing police misconduct, provided the algorithms are completely transparent.

Public diplomacy events in the future

Check our website regularly, like us on Facebook, and follow us on Twitter and Instagram to stay up to date on future public events organised by the Dutch Embassy.