After Coming Under Intense Criticism Apple Delays Plan to Scan Phones for Child Abuse Images
© AP Photo / Keith Srakocic | Phones on display at an Apple store at a shopping mall in Pittsburgh
Apple had said it had developed a tool designed to detect whether child sexual abuse images had been uploaded to iCloud by users. But critics claimed the software, called neuralMatch, was an invasion of privacy and prone to error.
After an avalanche of criticism from privacy groups, Apple has backed down and delayed plans to introduce a “child safety feature” on iPhones.
Last month, Apple said the tool would check the iPhones and personal computers of customers in the United States to flag images popular with paedophiles.
Apple, which had planned to roll out the feature for iPhones, iPads, and Macs later this year, had insisted the software would not flag innocent images such as holiday snaps of young children in swimming costumes.
#Apple has postponed the launch of a feature that automatically detects photos containing scenes of child abuse (#CSAM) in #iOS 15.
— Yaroslav Gavrilov (@appletesterrus) September 3, 2021
The company listened to the criticism and promised to refine the algorithm.
But critics said the idea was the tip of the iceberg: it could allow repressive governments to scan phones for political images and cartoons, and could be used as a tool of censorship.
Apple had insisted it would allow security researchers to verify its claims, but on Friday, 3 September, the company backed down and said it would put its plans on hold.
In a statement, Apple said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Matthew Green, a cybersecurity researcher at Johns Hopkins University, welcomed the move.
In tweets aimed at Apple, Green wrote: “This isn’t a fancy new touchbar: it’s a privacy compromise that affects 1bn users. Going from scanning nothing (but email attachments) to scanning everyone’s private photo library was an enormous delta. You need to justify escalations like this.”