
Apple's New Scanner Looking for Child Abuse Content Raises Concerns Around Users' Privacy

The software, called “neuralMatch”, will detect images that fall under the category of child pornography and flag them for review by a human.
Apple will use the tool, which is designed to detect known images of child sexual abuse, to scan photos uploaded to iCloud by US users.
If a case of child sexual abuse imagery is confirmed, the National Center for Missing and Exploited Children (NCMEC) will be notified of the user’s account.
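In outline, systems of this kind compare a compact fingerprint of each uploaded photo against a database of fingerprints derived from known abuse imagery, and only a match is escalated to human review. Below is a minimal sketch of that flow; every name in it (image_fingerprint, needs_human_review, REVIEW_THRESHOLD) is hypothetical, and Apple's actual pipeline differs in important ways, such as using a perceptual hash rather than a cryptographic one.

```python
import hashlib
from typing import List, Set

REVIEW_THRESHOLD = 3  # assumed number of matches before human review; illustrative only


def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real system needs a hash that stays
    stable when an image is resized or re-encoded; a cryptographic digest
    like SHA-256 only matches byte-identical files."""
    return hashlib.sha256(image_bytes).hexdigest()


def needs_human_review(uploads: List[bytes], known_hashes: Set[str]) -> bool:
    """Count uploads whose fingerprint appears in the known-image hash set;
    only when enough match would the account be escalated to a reviewer."""
    matches = sum(1 for img in uploads if image_fingerprint(img) in known_hashes)
    return matches >= REVIEW_THRESHOLD
```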
“At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM),” the tech giant said.
Apple will introduce the safety features in three areas, one of which is meant to give parents more insight into how their children navigate the internet. The Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable by Apple, the tech giant explained.
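The privacy claim rests on where that classification happens: the model runs on the handset itself, so message content never leaves the device for analysis. A rough sketch of the gating logic, with all names (classify_on_device, deliver, IncomingMessage) invented for illustration rather than taken from Apple's API:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class IncomingMessage:
    sender: str
    image: Optional[bytes] = None


def classify_on_device(image: bytes) -> bool:
    """Placeholder for an on-device ML classifier; returns True for content
    judged sensitive. The trivial size check below is demo logic only."""
    return len(image) > 1_000_000


def deliver(message: IncomingMessage, child_account: bool) -> str:
    """Blur sensitive images and show a warning on a child's account; deliver
    everything else unchanged. The classification result, like the photo,
    stays on the device."""
    if child_account and message.image and classify_on_device(message.image):
        return "blurred with warning: tap to view"
    return "delivered"
```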
John Clark, the president and CEO of NCMEC, called Apple’s expanded protection for children “a game changer”.
“With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material. At the National Center for Missing & Exploited Children we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known,” Clark said.
Despite the words of praise for “neuralMatch”, the technology has been criticised for possible violations of user privacy.
Experts from Johns Hopkins University and Stanford University have raised concerns about the privacy implications of Apple’s scanning technology.
They argued that the system could be expanded to scan for images unrelated to child abuse, which would violate the privacy of Apple users.
Parents who store photos of their children in the bath, for example, where an element of nudity is present, may also be concerned that the software will flag their pictures as abuse content.
The new features will be introduced by Apple later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
“This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time,” Apple said.