In June we all watched as the Black Lives Matter movement took full force in response to the murder of George Floyd at the hands of a Minneapolis police officer. Since these events took place, the world has been questioning current law enforcement practices and looking for ways to improve moving forward.
Leading this review has been the United Nations, which recently released research titled ‘Preventing and Combating Racial Profiling by Law Enforcement Officials’. The paper puts the role of data in policing under scrutiny and looks at how these techniques might reinforce racial biases.
Given the rise of data and analytics in recent times, police departments – much like commercial businesses – around the world have been turning to data as a way to improve the accuracy, efficiency and effectiveness of decision making. In theory, using this data to inform decisions can be helpful when it comes to allocating resources and keeping communities safe.
Crime prevention app RTM Dx, for example, collates geolocation and crime data to measure the spatial correlation between various sites and crime rates. Police can then use this information to monitor and target the locations where crime is predicted to occur.
However, the report has found that these data strategies can also reinforce racial bias, particularly when they rely on historical data to predict possible future events. Using historical arrest data to build a predictive policing model, for example, may ultimately lead to over-policing and, in turn, more arrests in the same neighbourhoods.
“Big data and A.I. tools may reproduce and reinforce already existing biases and lead to even more discriminatory practices,” Dr. Verene Shepherd, who led the paper, said about the findings.
“Such data will deepen the risk of over-policing in the same neighborhood, which in turn may lead to more arrests, creating a dangerous feedback loop.”
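The feedback loop Dr Shepherd describes can be sketched in a few lines of Python. The model below is a deliberately simplified illustration, not anything from the report: two neighbourhoods have identical true crime rates, but one starts with twice the historical arrests, and patrols are allocated in proportion to arrest history.

```python
def patrol_shares(arrest_history):
    """Allocate patrols in proportion to historical arrest counts."""
    total = sum(arrest_history)
    return [a / total for a in arrest_history]

def run_simulation(true_crime, arrest_history, rounds):
    """Each round, the crime detected in a neighbourhood scales with
    patrol presence there, and those detections feed back into the
    arrest history that drives the next round's allocation."""
    history = list(arrest_history)
    for _ in range(rounds):
        shares = patrol_shares(history)
        for i, share in enumerate(shares):
            history[i] += true_crime[i] * share
    return history

# Two neighbourhoods with IDENTICAL true crime rates, but one carries
# twice the historical arrests (e.g. from past over-policing).
history = run_simulation(true_crime=[10.0, 10.0],
                         arrest_history=[2.0, 1.0],
                         rounds=50)
shares = patrol_shares(history)
print(shares)  # the initial skew never corrects: roughly [0.67, 0.33]
```

Even after 50 rounds, the over-policed neighbourhood still receives about two-thirds of patrols despite committing exactly half the crime: the arrest data keeps confirming the allocation that produced it.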
To minimise racial prejudices that may emerge from data in the future, the United Nations has recommended that states focus on educating the people who interpret this data. Further education for law enforcement about the ethical concerns surrounding such technology could not only help reduce racial profiling, but could also identify future race-related issues arising from the technology.
The report also calls on governments to rethink how they collect data when it comes to law enforcement. It calls for an end to the automated processing of personal data to predict aspects such as a person's performance at work, economic situation, health and behaviour. It also calls for anonymised data on law enforcement practices such as identity checks and traffic stops to be made public.
Beyond showing how data can reinforce racial biases, the report's findings also highlight the risks of using historical data to predict future outcomes in the wider business world.
Historical data – whether that be purchase activity or log files – is an extremely valuable data resource for any business. However, data strategies that rely too heavily on this historical data risk repeating past mistakes and limiting overall results.
Top-performing data initiatives are those that combine historical data with a range of different data sources to create a clearer picture of the topic.
At smrtr, we are working on our ‘smrtr for Good’ initiative to help use data to create a better world. As part of this program, we support PhD research into how data can be used to create positive change.
By Boris Guennewig, Co-Founder & CTO at smrtr