- Sputnik International

UK Police Halt Facial Recognition Tech Trial After Millions Secretly Snapped

© AP Photo / Danny Lawson/PA: Police in north Manchester. (File)
Greater Manchester Police (GMP) have been forced to halt a controversial covert surveillance programme that monitored every visitor to the city's Trafford Centre shopping precinct, which attracts an estimated 30 million people annually.

GMP used Automatic Facial Recognition (AFR) technology to scan shoppers in the area for six months, analyzing and storing the images of potentially millions of people without any official announcement, or consent from those covertly surveilled.

No Permission

However, the pilot scheme was halted after Surveillance Camera Commissioner Tony Porter raised a number of concerns about the project. In particular, he was concerned the scheme hadn't been signed off by senior officers at strategic command level, or subjected to sufficient legal oversight.

Moreover, Porter believed the project's scope, blanket surveillance of all individuals in the area, was far too sweeping given its relatively modest objectives: finding wanted criminals and missing people.

"Compared to the size and scale of the processing of all people passing a camera the group they might hope to identify was miniscule. The proportionality of the enterprise was effectively placed under due legal consideration. The police have stepped back from engagement having recognized their approach is not currently proportionate," he wrote in an official blog.

© Flickr / Mikey: GMP Headquarters
Notably, over the course of the pilot's operation, the AFR technology made just one positive identification — a convict wanted on recall to prison.

"In April this year, Greater Manchester Police began to explore the use of automatic facial recognition with our partners at the Trafford Centre. This pilot involved a limited number of pictures and at no time were any names or personal details passed to colleagues at the Trafford Centre," a GMP spokesperson said.

History of Failure

GMP launched the pilot in April after being invited to take part by Trafford Centre security bosses. It's the largest endeavour of its kind in UK history; previously, AFR technology has been trialled only at large, one-off events, such as London's Notting Hill Carnival and the Champions League final in Cardiff.

Police forces across the country have set aside millions to purchase and develop AFR provisions of their own, although authorities' determination to adopt the technology is somewhat baffling, given prior test-runs have made clear it is almost entirely inaccurate.

© AP Photo / Sang Tan: British police officers stand on duty during Europe's largest street festival, the Notting Hill Carnival in London, UK
For instance, when deployed at the 2017 Notting Hill Carnival, the technology produced one accurate identification and 95 'false positives'. Even more damningly, the two individuals who've been correctly identified by the Met's AFR systems since 2016 weren't criminals — one had been placed on an internal watch list by mistake, and the other was on a mental health-related watch list, as someone who could potentially be a risk to themselves or others. Despite this less than illustrious history, the force intends to operate AFR systems at several other large events in future.

South Wales Police's AFR experience is only slightly less woeful, with their systems producing false positives 91 percent of the time. Impressively, the 2017 UEFA Champions League Final week produced 173 positive matches over seven days — although 2,297 were wrongly identified by the technology.
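The reported figures can be turned into comparable false-positive rates, i.e. the share of system matches that were wrong. A minimal illustrative calculation (the `false_positive_rate` helper is hypothetical, not any police methodology; the 91 percent figure quoted above evidently covers a wider run of deployments than the Champions League final alone):

```python
def false_positive_rate(true_matches: int, false_matches: int) -> float:
    """Share of all flagged matches that were incorrect."""
    return false_matches / (true_matches + false_matches)

# Notting Hill Carnival 2017: 1 accurate identification, 95 false positives
notting_hill = false_positive_rate(1, 95)

# 2017 UEFA Champions League Final week: 173 positive matches,
# 2,297 wrongly identified
cardiff = false_positive_rate(173, 2297)

print(f"Notting Hill: {notting_hill:.1%}")  # 99.0%
print(f"Cardiff:      {cardiff:.1%}")       # 93.0%
```

On these numbers, more than nine in ten people flagged by the systems at the Cardiff final were flagged wrongly, and at Notting Hill virtually all were.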

© AP Photo: A CCTV (Closed Circuit Television) camera is seen against the backdrop of Big Ben in central London

While no one incorrectly identified has been arrested by the force as yet, officers have staged interventions with many, compelling them to prove their identities — an obvious inversion of the presumption of innocence, and the right to remain anonymous unless charged with an offence.

While AFR misidentification affects anyone and everyone, there is much evidence to suggest the technology's algorithms disproportionately misidentify black people and women.

For instance, a Massachusetts Institute of Technology study of the commercial use of artificial intelligence systems found the error rate of facial recognition software was 43 times higher for dark-skinned women than for light-skinned men.

Moreover, Big Brother Watch has documented how South Wales Police stored biometric photos of all 2,451 innocent people wrongly identified by the system for 12 months, a policy that may be unlawful. In any event, facial recognition technology operates in a legal grey area, not subject to regulation or statutory oversight, and Big Brother Watch, among others, strongly argues its use contravenes existing legislation.

"It's highly questionable whether the use of automated facial recognition is compatible with fundamental human rights — in particular, the rights to a private life and freedom of expression. The necessity of such biometric surveillance is highly questionable, and inherently indiscriminate scanning appears to be plainly disproportionate. As it stands, the risk automated facial recognition is fundamentally incompatible with people's rights under the Human Rights Act 1998 is yet to be considered," the rights group has written.
