It has emerged that the Metropolitan Police and British Transport Police did share images with the King’s Cross Estate in London for its facial recognition technology, despite both forces previously denying any involvement in the surveillance project.
Developers of the site – which is home to King’s Cross and St Pancras International stations, as well as restaurants, shops and cafes – said earlier this week that the system was used only to help both forces “prevent and detect crime in the neighbourhood and ultimately to help ensure public safety”.
The British Transport Police originally said it had “not contributed, nor has benefited” from the technology, but is now “correcting” its position.
It now says that local teams based at King’s Cross worked with partners between 2016 and 2018 “to share images of a small number of convicted offenders, who routinely offended or committed anti-social behaviour (ASB) in the area”.
“This was legitimate action in order to prevent crime and keep people safe,” a spokesman said.
“Understandably, the public are interested in police use of such technologies, which is why we are correcting our position.”
Mayor of London Sadiq Khan said the original information provided by the Metropolitan Police was also “incorrect” and that “they have in fact shared images related to facial recognition with King’s Cross Central Limited Partnership”.
“I am informed that this ceased in 2018,” Mr Khan said.
“As a matter of urgency, I have asked for a report from the MPS on this concerning development and on their wider data-sharing arrangements, including what information has been shared and with whom.
“I apologise to the Assembly Member that the previous information provided was inaccurate.
“A fuller update will be provided to London Assembly Members as soon as I am able.”
Developers of the area said two facial recognition cameras were in operation at King’s Boulevard, and that their use was stopped in March 2018.
Use of the technology has been under the spotlight since the UK’s data and privacy watchdog, the Information Commissioner’s Office (ICO), launched an investigation last month.
Information Commissioner Elizabeth Denham said the watchdog is “deeply concerned about the growing use of facial recognition technology in public spaces” and is seeking “detailed information” about how it is used.
The revelation comes after activist Ed Bridges, from Cardiff, lost the world’s first legal challenge over police use of facial recognition technology.
The 36-year-old told the High Court that his face was scanned while Christmas shopping in 2017, and at a peaceful anti-arms protest in 2018.
His lawyers argued the use of automatic facial recognition (AFR) by South Wales Police caused him “distress” and violated his privacy and data protection rights by processing an image taken of him in public.
But Mr Bridges’s case was dismissed on Wednesday by two leading judges, who said the use of the technology was not unlawful, though he vowed to appeal against the ruling.