By: News Archive
Facial recognition technology, being trialled by two major police forces in Britain, should be subjected to more rigorous testing and transparency, according to new research from the University of East Anglia (UEA) and Monash University.
Facial recognition technology (FRT) identifies an individual by analysing the geometric features of his or her face and comparing the biometric template generated from the captured image with one already stored, such as a custody image or a photograph from a social media account. The technology was first tested at public gatherings in 2014, when Leicestershire Police trialled a ‘NeoFace’ facial recognition system, later using it to identify ‘known offenders’ at a music festival attended by 90,000 concertgoers.
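In outline, such a system reduces each face to a numerical template (an ‘embedding’) and raises an alert when a template captured from a live camera is sufficiently similar to one stored on a watch list. The sketch below illustrates this one-to-many matching step only; the embedding vectors, threshold and watch list names are hypothetical, and bear no relation to the proprietary NeoFace algorithm.

```python
# Illustrative sketch of one-to-many facial recognition matching.
# The embedding model, threshold and watch list below are hypothetical;
# deployed systems use proprietary algorithms and tuned thresholds.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face templates, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return (name, score) pairs for stored templates above the threshold."""
    alerts = []
    for name, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= threshold:
            alerts.append((name, score))
    return sorted(alerts, key=lambda pair: pair[1], reverse=True)

# Synthetic example: 128-dimensional vectors standing in for real templates.
rng = np.random.default_rng(0)
watchlist = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
probe = watchlist["person_42"] + rng.normal(scale=0.1, size=128)  # noisy capture
print(match_against_watchlist(probe, watchlist))  # flags person_42 only
```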
Leicestershire Police and the two forces currently trialling FRT – the Metropolitan Police Service and South Wales Police – argue the technology is lawful and its use in surveillance operations is proportionate. But researchers from UEA and Monash University in Australia say the technology could violate human rights, and argue that insufficient statistical information about the trials has been made publicly available for scrutiny. The limited outcomes that have been shared, the researchers say, show high false-positive identification rates and a low number of positive matches with ‘known offenders’.
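The arithmetic behind that concern is straightforward: when tens of thousands of faces are scanned, even a seemingly low false-positive rate produces a large number of false alerts. The figures in the sketch below are illustrative assumptions, not statistics from the trials.

```python
# Hypothetical illustration of the base-rate problem in mass FRT screening.
# None of these figures come from the police trials; all are assumptions.
crowd_size = 90_000            # e.g. the Leicestershire music festival
offenders_present = 50         # assumed watch-listed people in the crowd
false_positive_rate = 0.001    # assumed: 1 in 1,000 innocent faces flagged
true_positive_rate = 0.90      # assumed: 90% of watch-listed faces detected

false_alerts = (crowd_size - offenders_present) * false_positive_rate
true_alerts = offenders_present * true_positive_rate
precision = true_alerts / (true_alerts + false_alerts)

print(f"False alerts: {false_alerts:.0f}")   # ~90 innocent people flagged
print(f"True alerts:  {true_alerts:.0f}")    # ~45 genuine matches
print(f"Share of alerts that are correct: {precision:.0%}")  # ~33%
```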
Furthermore, the researchers say the trials are a costly use of public funds: £200,000 for the Met Police trials and £2.6 million for those run by the South Wales Police.
The research, led by Dr Joe Purshouse of the UEA School of Law, and Prof Liz Campbell of Monash University, will be published on February 8, 2019 in the journal Criminal Law Review.
Dr Purshouse, a lecturer in criminal law, said: “These FRT trials have been operating in a legal vacuum. There is currently no legal framework specifically regulating the police use of FRT.
“Parliament should set out rules governing the scope of the power of the police to deploy FRT surveillance in public spaces to ensure consistency across police forces. As it currently stands, police forces trialling FRT are left to come up with divergent, and sometimes troubling, policies and practices for the execution of their FRT operations.”
A key concern for the researchers is the ‘watch list’ databases of facial images, assembled not only from lists of wanted suspects and missing persons but also from other ‘persons of interest’. There is no legal prohibition on police forces taking images from the internet or social media accounts to populate these ‘watch lists’.
Dr Purshouse and Prof Campbell say there is a risk that people with old or minor convictions could be targeted by FRT, as well as those with no convictions whose images are retained and used by police after an arrest that did not lead to a conviction.
The researchers also question the accuracy of the technology, raising concerns that some individuals might be disproportionately included on ‘watch lists’. The limited independent testing and research available indicates that numerous FRT systems misidentify ethnic minorities and women at higher rates than the rest of the population. A disproportionate number of custody images are of black and minority ethnic people, and as these images are routinely used to populate FRT databases, there is a particular risk that members of the public from those backgrounds will be mistakenly identified as ‘persons of interest’.
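The compounding effect can be made concrete with assumed numbers: where one group both experiences a higher per-scan misidentification rate and is scanned against databases skewed towards it, its members absorb a disproportionate share of false matches. Every rate below is an illustrative assumption, not a measured figure.

```python
# Hypothetical illustration of how unequal error rates compound.
# All figures are assumptions, not results from any FRT trial.
groups = {
    # group: (share of crowd scanned, assumed false-positive rate per scan)
    "group_A": (0.85, 0.0005),
    "group_B": (0.15, 0.0015),  # assumed 3x higher misidentification rate
}
crowd_size = 90_000

for group, (share, fpr) in groups.items():
    scanned = crowd_size * share
    false_matches = scanned * fpr
    print(f"{group}: {scanned:,.0f} scanned, {false_matches:.0f} false matches")
# group_B is 15% of the crowd but accounts for ~35% of false matches here.
```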
Dr Purshouse said: “There appears to be a credible risk that FRT technology will undermine the legitimacy of the police in the eyes of already over-policed groups.”
The police forces trialling FRT say the technology has been effective in preventing crime and ensuring public safety. The researchers counter that there is currently no meaningful way of measuring success, though the technology may deter those who pose a threat to the public from attending gatherings where FRT surveillance is known to be in use.
The researchers say the use of FRT surveillance is on the rise without sufficient reflection on its aims and consequences. The ways in which it can interfere with citizens’ privacy-related rights are multifaceted and complex, they argue, and without a full understanding of this potential the technology cannot be adequately regulated.
Dr Purshouse added: “Rather than gradually becoming a pervasive and chilling feature of public life, FRT surveillance should only be targeted against credible and serious threats to public safety.”
‘Privacy, Crime Control and Police Use of Automated Facial Recognition Technology’, by Joe Purshouse and Liz Campbell, is published February 8, 2019 in the journal Criminal Law Review.