source:E&T

London’s Metropolitan Police force is preparing to use facial-recognition cameras to fight serious crime and find missing people.

Although facial-recognition cameras have not been regularly deployed before for these purposes, the Met has previously trialled the technology in East London, Central London, and in a one-off deployment during the 2017 Notting Hill Carnival. Even these limited deployments attracted criticism from privacy advocates and human rights experts.

Now, the Met will be using the technology more regularly to combat “serious and violent crime” and in the search for missing children and other vulnerable people. People who could be targeted by the technology include those suspected of committing gun and knife crime, and child abusers.

The cameras will be used to search for suspects placed on police or court watchlists, with camera locations chosen according to the likelihood of finding those suspects. When the system flags a possible match, the footage will be reviewed by a human officer; if the person appears to be the suspect, they will be approached, asked to identify themselves, and arrested if they are confirmed to be on the watchlist.
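The process described above can be sketched as a simple decision function. This is an illustrative model only, not the Met's actual system; the watchlist entries and function names are hypothetical.

```python
# Simplified model of the deployment workflow as reported: automated
# watchlist match -> human officer review -> approach -> arrest if confirmed.
# All identifiers below are invented for illustration.

WATCHLIST = {"suspect-042", "suspect-117"}  # hypothetical watchlist IDs

def handle_detection(candidate_id: str, officer_confirms: bool) -> str:
    """Return the outcome for a single camera alert."""
    if candidate_id not in WATCHLIST:
        return "no action"  # alerted person is not on any watchlist
    if not officer_confirms:
        return "no action"  # human reviewer rejects the automated match
    # Officer approaches and asks the person to identify themselves;
    # arrest follows only if the watchlist entry is confirmed.
    return "approach; arrest if confirmed on watchlist"
```

The key safeguard the Met describes is the human-review step: an automated match alone, in this model, never leads directly to an approach.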

Assistant Commissioner Nick Ephgrave said that facial-recognition cameras would be deployed for the first time within a month.

“Every day our police officers are briefed about suspects they should look out for; live facial recognition improves the effectiveness of this tactic,” he said. “Similarly, if it can help locate missing children or vulnerable adults swiftly, and keep them from harm and exploitation, then we have a duty to deploy the technology to do this.”

Ephgrave said that searching for suspects has long been a responsibility for the police and that the public expects the Met to use emerging technology to tackle crime: “Live facial recognition is about modernising this practice through technology to improve effectiveness and bring more offenders to justice.”

In order to keep the public informed about the practice, Ephgrave said that signs will be put up in the surrounding area to alert them to the facial-recognition cameras and leaflets will be handed out to passers-by to inform them about what is happening.

In response to the announcement, Big Brother Watch director Silkie Carlo commented: “This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK. It flies in the face of the independent review showing the Met’s use of facial recognition was likely unlawful, risked harming public rights and was 81 per cent inaccurate.”

“This is a breathtaking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claims against the Met and the Home Secretary. This move instantly stains the new government’s human rights record and we urge an immediate reconsideration.”

The Met and other London authorities are under intense pressure to crack down on violent crime in the capital, particularly knife crime. However, the deployment of facial-recognition cameras may be of limited effectiveness in the fight against crime, given the technology’s poor track record for accuracy, particularly with regard to ethnic minorities and women.

A demonstration run by the ACLU in 2018 used Rekognition (Amazon’s facial-recognition service, used by many police forces in the US) to compare the faces of Senators and Representatives against a database of 25,000 criminal mugshots. At a confidence threshold of 80 per cent, Rekognition incorrectly matched 28 members of Congress with mugshots, with ethnic-minority members disproportionately identified as criminals.
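The role of the confidence threshold in that demonstration can be illustrated with a toy example. The scores and names below are invented and do not come from the ACLU test or from Rekognition; the point is only that lowering the threshold inflates the number of reported “matches”.

```python
# Hypothetical illustration: how the choice of confidence threshold changes
# how many "matches" a face-matching system reports. Scores are invented.

scores = {"person_a": 0.81, "person_b": 0.86, "person_c": 0.97, "person_d": 0.64}

def matches_above(scores: dict, threshold: float) -> list:
    """Names whose best mugshot-similarity score meets the threshold."""
    return sorted(name for name, s in scores.items() if s >= threshold)

print(matches_above(scores, 0.80))  # ['person_a', 'person_b', 'person_c']
print(matches_above(scores, 0.99))  # []
```

At an 80 per cent threshold three of the four people are flagged; at 99 per cent, none are. Amazon itself has recommended far higher thresholds than 80 per cent for law-enforcement use.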

Despite issues around inaccurate matches and infringement of privacy, a landmark High Court case ruled that South Wales Police’s use of facial recognition was lawful.

Last year, Information Commissioner Elizabeth Denham commented that scanning the faces of people in public is “a potential threat to privacy that should concern us all”. Meanwhile, the European Commission is considering a temporary ban on facial recognition while a framework is developed to ensure that the technology can be deployed ethically.