GMP used Automatic Facial Recognition (AFR) technology to scan shoppers in the area for six months, analyzing and storing the images of potentially millions of people without any official announcement or consent from those covertly surveilled.
No Permission
However, the pilot scheme was halted after Surveillance Camera Commissioner Tony Porter raised a number of concerns about the project. In particular, he was troubled that the scheme hadn't been signed off by senior officers at strategic command level, or subjected to sufficient legal oversight.
Moreover, Porter believed the project's scope, blanket surveillance of all individuals in the area, was far too sweeping given its relatively modest objectives of finding wanted criminals and missing people.
"Compared to the size and scale of the processing of all people passing a camera the group they might hope to identify was miniscule. The proportionality of the enterprise was effectively placed under due legal consideration. The police have stepped back from engagement having recognized their approach is not currently proportionate," he wrote in an official blog.
"In April this year, Greater Manchester Police began to explore the use of automatic facial recognition with our partners at the Trafford Centre. This pilot involved a limited number of pictures and at no time were any names or personal details passed to colleagues at the Trafford Centre," a GMP spokesperson said.
History of Failure
GMP launched the pilot in April after being invited to take part by Trafford Centre security bosses. It was the largest endeavour of its kind in UK history; previously, AFR technology had only been trialled at large one-off events, such as London's Notting Hill Carnival and the UEFA Champions League final in Cardiff.
Police forces across the country have set aside millions to purchase and develop AFR capabilities of their own, although the authorities' determination to adopt the technology is somewhat baffling given that prior trials have shown it to be overwhelmingly inaccurate.
South Wales Police's experience with AFR has been particularly woeful, with its systems producing false positives 91 percent of the time. During the week of the 2017 UEFA Champions League Final, the technology produced 173 correct matches over seven days, while 2,297 people were wrongly identified.
While no one incorrectly identified has been arrested by the force as yet, officers have staged interventions with many, compelling them to prove their identities — an obvious inversion of the presumption of innocence, and the right to remain anonymous unless charged with an offence.
While AFR misidentification affects anyone and everyone, there is much evidence to suggest the technology's algorithms disproportionately misidentify black people and women.
For instance, a Massachusetts Institute of Technology study of the commercial use of artificial intelligence systems found the error rate of facial recognition software was 43 times higher for dark-skinned women than for light-skinned men.
"It's highly questionable whether the use of automated facial recognition is compatible with fundamental human rights — in particular, the rights to a private life and freedom of expression. The necessity of such biometric surveillance is highly questionable, and inherently indiscriminate scanning appears to be plainly disproportionate. As it stands, the risk automated facial recognition is fundamentally incompatible with people's rights under the Human Rights Act 1998 is yet to be considered," the rights group has written.