A report has found that the live facial recognition (LFR) technology trialled by the Metropolitan Police wrongly identified people in four out of five cases. It said there are “significant concerns” about the technology. And as a result, Big Brother Watch said the Met must stop deploying LFR “urgently”.
Inaccurate and misleading
Researchers from the Human Rights, Big Data and Technology Project were present for six LFR trials run by the Met from June 2018. And they found it “highly possible” that the trials wouldn’t stand up to legal challenges, because the technology wrongly identified people four out of five times. As Sky News reported, researchers could verify just eight of the 42 people flagged during the trials as correct matches; the other 34, around 81%, were misidentified. The report also highlighted the Met’s use of outdated watch lists.
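For clarity, this is the arithmetic behind the “four out of five” figure, using only the numbers reported above (a minimal sketch; the variable names are purely illustrative):

```python
# Figures reported by Sky News from the six Met LFR trials
flagged = 42            # people the LFR system flagged as matches
verified_correct = 8    # matches the researchers could verify as correct

false_matches = flagged - verified_correct    # 34 wrong matches
false_match_rate = false_matches / flagged    # 34 / 42

print(f"{false_match_rate:.0%}")  # ~81%, i.e. roughly four out of five
```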
The report also aired concerns over consent. This included questioning whether the Met made enough information available for members of the public to give informed consent before entering a camera’s field of view. It also raised issues around withdrawing or refusing consent. In one example, police intervened when a person covered their face while walking past an LFR camera. The report said:
treating LFR camera avoidance as suspicious behaviour undermines the premise of informed consent. In addition, the arrest of LFR camera avoiding individuals for more minor offences than those used to justify the test deployments raise clear issues regarding the extension of police powers and of ‘surveillance creep’.
“Utterly damning”
Responding to the findings, Silkie Carlo, director of privacy group Big Brother Watch, said:
This report is an utterly damning conclusion to the police’s dangerous experimentation with live facial recognition. It confirms what we have long warned – it’s inaccurate, lawless, and must be stopped urgently. This message is now coming not just from us, but from the independent reviewers commissioned by the Metropolitan Police themselves. The only question that remains is when will the police finally drop live facial recognition? The public’s freedoms are at stake and it is long overdue.
Carlo went on to say that the Met’s LFR cameras have “no place in Britain”. She also said that, in light of Big Brother Watch’s legal challenge against the Met, she hopes “the force will now decide not to use live facial recognition any further”.
Dangerous technology
The Met claimed its own analysis produced very different results. Its method, which Sky News said compares “successful and unsuccessful matches with the total number of faces processed”, put the error rate at just 0.1%. Defending the trials, the Met’s deputy assistant commissioner Duncan Ball said:
We are extremely disappointed with the negative and unbalanced tone of this report… We have a legal basis for this pilot period and have taken legal advice throughout.
We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer.
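The gulf between 81% and 0.1% comes down almost entirely to the choice of denominator. Here is a sketch of the Met’s calculation as Sky News describes it; note that the total number of faces processed is a hypothetical figure, since the actual count is not given in the coverage:

```python
false_matches = 34  # wrong matches across the trials, per the researchers

# HYPOTHETICAL: total faces the cameras scanned across the trials.
# The real figure is not reported; 34,000 is chosen purely to show
# how a per-face denominator produces the Met's quoted 0.1%.
total_faces_processed = 34_000

error_rate = false_matches / total_faces_processed
print(f"{error_rate:.1%}")  # 0.1% when errors are spread over every face scanned
```

Measured against everyone whose face passed the camera, most of whom were never flagged at all, the same 34 errors look vanishingly small; measured against the matches the system actually made, they dominate.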
But LFR technology is facing criticism internationally, both for inaccurate and misleading results and for police misuse of data. This report drives home just how dangerous it will be, and the impact it will have on real people’s lives.
Featured image via YouTube – JSUK News