The Equality and Human Rights Commission (EHRC) has called for the public use of facial recognition to be halted.
Concerns have been raised about facial recognition’s potential to
automate racial discrimination and hinder freedom of expression.
The UK, the second most surveilled nation after China, has been at
the forefront of testing facial recognition systems in the West. Police
in London and South Wales have tested facial recognition in stadiums,
arenas, and shopping centres.
Facial recognition trials in the UK so far have been nothing short of a
complete failure. An initial trial, at the 2016 Notting Hill Carnival,
identified not a single person. A follow-up trial the following year
produced no legitimate matches but 35 false positives.
An independent report into the Met Police’s facial recognition
trials, conducted by Professor Peter Fussey and Dr Daragh Murray last
year, concluded that it was verifiably accurate in just 19 percent
of cases.
Last month, Met Police Chief Commissioner Cressida Dick dismissed critics of law enforcement using facial recognition systems as being “highly inaccurate or highly ill-informed.”
The EHRC wants public use of facial recognition to be halted until
the technology and its impact have been independently scrutinised and
laws governing its use are improved. However, last September, the high
court in Cardiff ruled that the police’s use of automatic facial
recognition to find people in crowds is lawful.
In a report to the UN on civil and political rights in the UK, the
EHRC said: “Evidence indicates many AFR algorithms disproportionately
misidentify black people and women and therefore operate in a
potentially discriminatory manner … Such technologies may replicate and
magnify patterns of discrimination in policing and have a chilling
effect on freedom of association and expression.”
The calls from the EHRC, alongside those from organisations such as
Amnesty International and the ACLU, put more pressure on law
enforcement to halt their trials.