Cardiff man receives go-ahead for legal challenge to police use of facial recognition technology

Cardiff resident Ed Bridges has been given the go-ahead to begin the first legal challenge to a UK police force’s use of automated facial recognition (AFR) technology in what will be a critical nationwide test of the state’s power to deploy radical biometric surveillance methods. Represented by the human rights organisation Liberty, Bridges had threatened legal action against South Wales Police if the force did not immediately end its use of AFR technology in public spaces.

Chief Constable Matt Jukes has now confirmed the force will not seek to prevent the case from taking place, paving the way for the High Court to review South Wales Police’s ongoing deployment of the technology. Jukes has said that South Wales Police welcomes the scrutiny of the High Court on this issue.

Surveillance cameras equipped with AFR software scan the faces of passers-by, making unique biometric maps of their faces. These maps are then compared with, and matched against, facial images held on bespoke – and, according to Liberty, “often error-ridden” – police databases.

South Wales Police has used facial recognition in public spaces on at least 22 occasions since May last year, and Bridges believes his face was scanned by the force at both a peaceful anti-arms protest and while doing his Christmas shopping. He will seek to challenge the use of AFR technology in court because it – again, according to Liberty – “violates the privacy rights of everyone within range of the cameras, has a chilling effect on peaceful protest, discriminates against women and BAME people and breaches data protection laws.”

Members of the public have so far donated more than £3,450 to Bridges’ challenge via crowdfunding site CrowdJustice.

Ed Bridges said: “This dystopian style of policing has no place in Cardiff or anywhere else, and I’m delighted this legal challenge will go ahead. Without warning, the police have used this invasive technology on peaceful protesters and thousands of people going about their daily business, providing no explanation of how it works and no opportunity for us to consent. The police’s indiscriminate use of facial recognition technology on our streets makes our privacy rights worthless and will force us all to alter our behaviour. It needs to be challenged and needs to stop.”

Megan Goulding, lawyer for Liberty and solicitor for Ed Bridges, added: “We’re pleased South Wales Police has recognised the importance of this issue and agreed to a Judge reviewing its actions. The police’s creeping roll-out of facial recognition is not authorised by any law, guided by any official policy or scrutinised by any independent body. Scanning the faces of thousands of people whenever they see fit and comparing them to shady databases which can contain images sourced from anywhere at all has seriously chilling implications for our freedom.”

South Wales Police and facial recognition

AFR technology scans the faces of all passers-by in real-time. The software measures their biometric facial characteristics, creating unique facial maps in the form of numerical codes. These codes are then compared to those of other images on bespoke databases.
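
For readers unfamiliar with the mechanics, the toy sketch below illustrates the general shape of that matching step. It is purely illustrative – the 128-number ‘facial maps’, the watch list and the similarity threshold are all invented – and bears no relation to the proprietary software the forces actually deploy.

```python
# A purely illustrative sketch of the matching step described above, not the
# proprietary AFR software actually deployed. The 128-number 'facial maps',
# the watch list and the similarity threshold are all invented.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two biometric 'facial maps' (numerical codes) by angle."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical watch-list database: each entry maps an identity to a
# pre-computed numerical code derived from a stored facial image.
rng = np.random.default_rng(0)
watch_list = {f"person_{i}": rng.normal(size=128) for i in range(1000)}

# The numerical code produced for one passer-by scanned in real time.
live_scan = rng.normal(size=128)

# Flag every watch-list entry whose similarity exceeds the threshold;
# anything flagged would then be passed to an officer for review.
THRESHOLD = 0.3  # arbitrary for illustration; real systems tune this value
alerts = [(name, score) for name, code in watch_list.items()
          if (score := cosine_similarity(live_scan, code)) > THRESHOLD]
print(f"{len(alerts)} possible match(es) flagged for officer review")
```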

Three UK police forces have used AFR technology in public spaces since June 2015 – South Wales Police, the Metropolitan Police Service and Leicestershire Police. South Wales Police has been at the forefront of its deployment, using the technology in public spaces at least 20 times.

Ed Bridges’ face has likely been mapped and his image stored at least twice. He believes he was scanned as a passer-by on a busy shopping street in Cardiff in the days before Christmas, and then again while peacefully protesting outside the Cardiff Arms Fair in March this year.

South Wales Police has admitted it has used AFR technology to target petty criminals, such as ticket touts and pickpockets outside football matches, but the force has also used it on peaceful protesters.

On 27 March this year, the police used AFR technology at a protest outside the Defence, Procurement, Research, Technology and Exportability Exhibition – the ‘Cardiff Arms Fair’. Ed Bridges attended the protest and believes that he, like many others there, was scanned by the AFR camera opposite the fair’s main entrance.

Protesters were not aware that facial recognition would be deployed and the police did not provide any information at the time of the event.

Freedom of Information requests have revealed that South Wales Police’s use of AFR technology has produced ‘true matches’ with less than 9% accuracy: more than 91% of ‘matches’ were misidentifications of innocent members of the public.

South Wales Police has wrongly identified 2,451 people, 31 of whom were asked to confirm their identities. Only 15 arrests have been linked to the use of AFR.
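
How those two figures fit together can be seen with some back-of-the-envelope arithmetic, set out below on the assumption that the 2,451 misidentifications account for the roughly 91% of ‘matches’ that were wrong. The implied totals are estimates, not figures released by the force.

```python
# Back-of-the-envelope arithmetic only, derived from the FoI figures quoted
# above; South Wales Police has not published this breakdown.
false_matches = 2451          # people wrongly identified (FoI figure)
false_match_rate = 0.91       # roughly 91% of 'matches' were misidentifications

total_matches = false_matches / false_match_rate    # ~2,690 people flagged in all
true_matches = total_matches - false_matches        # ~240 genuine matches

print(f"Implied total matches: {total_matches:.0f}")
print(f"Implied true matches:  {true_matches:.0f} "
      f"({true_matches / total_matches:.0%} accuracy)")
```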

On one occasion – at the 2017 Champions League final in Cardiff – the technology was later found to have wrongly identified more than 2,200 people as possible criminals.

Images of all passers-by, whether or not they are true matches, are stored by the force for 31 days, potentially without their knowledge.

Members of the public scanned by AFR technology have not provided their consent and are often completely unaware that it’s in use. It’s not authorised by any law and the Government has not provided any policies or guidance on it. No independent oversight body regulates its use.

Specifics of the Bridges case

Ed Bridges is taking legal action against South Wales Police because he feels the force’s use of AFR technology:

*Violates the general public’s right to privacy by indiscriminately scanning, mapping and checking the identity of every person within the camera’s range and capturing personal biometric data without consent (and can lead to innocent people being stopped and questioned by police)

*Interferes with freedom of expression and protest rights, having a chilling effect on people’s attendance at public events and peaceful protests. The presence of a police AFR van can be highly intimidating and affect people’s behaviour by sending the message that they are being watched and can be identified, tracked and marked for further police action.

*Discriminates against women and BAME people. Studies have shown that AFR technology disproportionately misidentifies female and non-white faces, meaning they are more likely to be wrongly stopped and questioned by police and to have their images retained

*Breaches data protection laws. The processing of personal data cannot be lawful because there is no law providing any detailed regulation of AFR use. The vast majority of personal data processed by the technology is also irrelevant to law enforcement, belonging to innocent members of the public going about their business, and so the practice is both excessive and unnecessary

Metropolitan Police Service trials ongoing

Detective Superintendent Bernie Galopin, the operational lead for live facial recognition at the Metropolitan Police Service, has stated on the force’s website: “Trials of the Met’s live facial recognition technology are ongoing, and the equipment is now being tested in a range of policing environments including public order events, sports events and crowded public spaces. The technology was recently tested at the Port of Hull for the first time by Humberside Police, and was most recently used at Stratford Station on Thursday 28 June alongside a proactive knife arch operation run by the Met’s Violent Crime Task Force and British Transport Police. The deployment formed part of the Met’s ongoing trial of the technology, and was used to further assess how it can support standard policing activity and assist in tackling violent crime.”

Galopin continued: “As with all previous deployments, the technology was used overtly. Information leaflets were handed to members of the public, posters were placed in the area and officers engaged with members of the public to explain the process and technology. Officers on the knife arch operation recovered two large knives and arrested two individuals. There were no arrests from the use of live facial recognition, but this deployment formed an important part of ongoing trials of the technology. A full review of its use will take place once the trials have been completed.”

In addition, Galopin stated: “It’s important to note all the faces on the Watch List used during the deployment were of people wanted by the Met and the courts for violence-related offences. If the technology generated an alert to signal a match, police officers on the ground reviewed the alert and carried out further checks to confirm the identity of the individual. All alerts against the Watch List will be deleted after 30 days, while faces in the database that didn’t generate an alert were deleted immediately.”

Galopin and his colleagues will now consider the results of, and lessons learned from, this use of the technology. “We have committed to ten trials of live facial recognition and will consider further deployments over the coming months. The trial of the technology will also be subject to a full and independent evaluation at the end of this year.”
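
Stripped of the operational detail, the retention regime Galopin describes amounts to a simple rule: faces that generate no alert are deleted immediately, while alert records are kept for 30 days pending review. The sketch below models that rule and nothing more; its names and structures are invented for illustration and say nothing about how the Met’s systems are actually built.

```python
# Illustrative model of the retention rule described above: scans that trigger
# no alert are deleted immediately, alert records are kept for 30 days pending
# review. All class names, fields and storage here are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

ALERT_RETENTION = timedelta(days=30)

@dataclass
class Alert:
    scan_id: str
    watch_list_entry: str
    created_at: datetime
    confirmed_by_officer: bool = False   # identity is checked on the ground first

stored_alerts: List[Alert] = []

def process_scan(scan_id: str, watch_list_hit: Optional[str], now: datetime) -> None:
    """Handle one live scan: keep an alert record on a hit, otherwise keep nothing."""
    if watch_list_hit is None:
        return                            # no alert: the face data is discarded at once
    stored_alerts.append(Alert(scan_id, watch_list_hit, created_at=now))

def purge_expired(now: datetime) -> None:
    """Delete alert records once they are more than 30 days old."""
    stored_alerts[:] = [a for a in stored_alerts
                        if now - a.created_at < ALERT_RETENTION]
```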

