Liberty report “exposes police forces’ use of discriminatory data to predict crime”

by Brian Sims

At least 14 UK police forces have used or intend to use discriminatory computer programs to predict where crime will be committed and by whom. That’s according to research results published by human rights campaigning organisation Liberty. The new report, entitled ‘Policing by Machine’, collates the results of 90 Freedom of Information requests sent to every force in the UK, “laying bare the full extent of biased ‘predictive policing’ for the first time” and how it “threatens everyone’s rights and freedoms”.

The Liberty document reveals that 14 forces are using, have previously used or are planning to use “shady algorithms” which ‘map’ future crime or predict who will commit or be a victim of crime using biased police data.

The report exposes police algorithms entrenching pre-existing discrimination, directing police officers to patrol areas which are already disproportionately over-policed. Liberty also notes predictive policing programs which assess an individual’s chances of victimisation, vulnerability, being reported missing or being the victim of domestic violence or a sexual offence based on offensive profiling.

In addition, Liberty points towards “a severe lack of transparency” with the public given very little information as to how predictive algorithms reach their decisions. According to the civil rights group, even the police do not understand how the machines come to their conclusions.

Also, Liberty’s report highlights the significant risk of ‘automation bias’ – a human decision-maker simply deferring to the machine and “accepting its indecipherable recommendation as correct”.

Hannah Couchman, advocacy and policy officer for Liberty, stated: “Predictive policing is sold as innovation, but the algorithms are driven by data already imbued with bias, firmly embedding discriminatory approaches in the system while adding a ‘neutral’ technological veneer that affords false legitimacy. Life-changing decisions are being made about us that are impossible to challenge. In a democracy which should value policing by consent, red lines must be drawn on how we want our communities to be policed.”

Biased machines and predictive mapping programs

Predictive policing algorithms analyse troves of historical police data, but Liberty feels this data presents a misleading picture of crime due to biased policing practices. The computer programs are not neutral, according to Liberty, and some are even capable of learning, becoming more autonomous in their predictions and entrenching pre-existing inequalities “while disguised as cost-effective innovation”.

Predictive mapping programs use police data about past crimes to identify ‘hot spots’ of high risk on a map. Police officers are then directed to patrol these areas, many of which will already be subject to policing interventions that are disproportionate to the level of crime in that area.
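The mapping approach described above can be illustrated with a minimal sketch. The real products forces have bought are proprietary, so the grid-count method, function names and data below are purely hypothetical stand-ins for how past incident locations might be bucketed into cells and ranked:

```python
from collections import Counter

def hotspot_cells(incidents, cell_size=0.5, top_n=3):
    """Bucket past incident coordinates into grid cells and
    return the most frequently hit cells as 'hot spots'."""
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in incidents
    )
    return [cell for cell, _ in counts.most_common(top_n)]

# Historical incident coordinates (illustrative only).
past_incidents = [(0.1, 0.2), (0.3, 0.1), (0.4, 0.4), (2.1, 2.2), (0.2, 0.3)]
print(hotspot_cells(past_incidents, top_n=1))  # → [(0, 0)]
```

Even this toy version shows the feedback loop Liberty warns about: officers directed to the top cell record more incidents there, which raises that cell’s count and keeps it at the top of the ranking regardless of the true distribution of crime.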

The following police forces have used or are planning to use predictive mapping programs: Avon and Somerset Police, Cheshire Constabulary, Dyfed Powys Police, Greater Manchester Police, Kent Police, Lancashire Constabulary, Merseyside Police, the Metropolitan Police Service, Northamptonshire Police, Warwickshire and West Mercia Police, the West Midlands Police and West Yorkshire Police.

Individual risk assessment programs predict how people will behave, including whether they are likely to commit – or even be the victims of – certain crimes. Durham Constabulary has used a program called the Harm Assessment Risk Tool (HART) since 2016. It uses machine learning to assess the likelihood of a person committing an offence but, according to Liberty, is designed to overestimate that risk.

HART bases its prediction on 34 pieces of data, including personal characteristics such as age, gender and postcode, which could encourage dangerous profiling. It has also considered factors such as “cramped houses” and “jobs with high turnover” when deciding the probability of a person committing crime.
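HART’s actual model and weightings are not public, but the profiling concern can be shown with a toy weighted score. Everything below – the feature names, the weights, the scoring function – is an invented illustration, not HART itself:

```python
def toy_risk_score(features, weights):
    """Weighted sum of feature values; a higher score means 'higher risk'."""
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

# Hypothetical weights: note how a non-behavioural feature (postcode)
# carries real weight in the prediction.
weights = {"prior_offences": 0.6, "age_under_25": 0.2, "postcode_band": 0.4}

person_a = {"prior_offences": 1, "age_under_25": 0, "postcode_band": 1}
person_b = {"prior_offences": 1, "age_under_25": 0, "postcode_band": 0}

# Identical offending history, different postcode -> different 'risk'.
print(toy_risk_score(person_a, weights))  # → 1.0
print(toy_risk_score(person_b, weights))  # → 0.6
```

Once where someone lives changes their score, the tool is effectively profiling by area – which is the danger Liberty identifies in HART’s use of postcode as an input.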

Avon and Somerset Police’s risk assessment program even predicts the likelihood of a person perpetrating or suffering serious domestic violence or violent sexual offences.

Individual risk assessment programs are being used by Avon and Somerset Police, Durham Constabulary and the West Midlands Police.

Threat to privacy rights

Like any system built on data collected from society, predictive policing programs reflect pre-existing patterns of discrimination, further embedding them into policing practice. Mapping programs direct officers to attend already over-policed areas, while individual risk assessment programs encourage an approach to policing based on discriminatory profiling, “lending unwarranted legitimacy” to these tactics.

Predictive algorithms also encourage reliance on Big Data – the enormous quantities of personal information accumulated about everyone in the digital age – which is then analysed to make judgements about people’s character, in turn “violating their privacy rights”.

This problem is compounded by the fact that the public – and the police – don’t know how the programs arrive at a decision. This means they’re not adequately overseen, and the public cannot hold them to account or properly challenge the predictions they make.

Recommendations in the report

Liberty’s report makes a number of recommendations, including:

*Police forces in the UK must end their use of predictive mapping programs and individual risk assessment programs

*At the very least, police forces in the UK should fully disclose information about their use of predictive policing programs. Where decision-making is informed by predictive policing programs or algorithms, this information needs to be communicated to those directly impacted by their use, and the public at large, in both a transparent and accessible way

*Investment in digital solutions for policing should focus on developing programs that actively reduce biased approaches to policing. A human rights impact assessment should be developed in relation to new digital solutions, which should then be rights-respecting by default and by design
