https://sputnikglobe.com/20210806/apples-new-scanner-looking-for-child-abuse-content-raises-concerns-around-users-privacy-1083541532.html
Apple's New Scanner Looking for Child Abuse Content Raises Concerns Around Users' Privacy
09:34 GMT 06.08.2021 (Updated: 13:31 GMT 01.03.2022) The software, called “neuralMatch”, will detect images falling under the category of child pornography and subsequently send them for human review.
Apple will allow a tool designed to detect known images of child sexual abuse to scan photos uploaded to iCloud by US users.
If child sexual abuse imagery is confirmed, the user’s account will be reported to the National Center for Missing and Exploited Children (NCMEC).
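The article does not detail the matching mechanism, but the flow it describes — comparing uploads against a database of known images, with a human reviewer in the loop before NCMEC is notified — can be sketched roughly as below. Every name, the hash function, and the threshold here are illustrative assumptions; Apple’s announced design relies on a proprietary perceptual image hash (so that visually similar images match) combined with cryptographic matching, none of which is reproduced here.

```python
import hashlib

# Hypothetical set of hashes of known abuse images (in reality such a
# database is supplied by NCMEC). A cryptographic hash is used purely to
# keep the sketch self-contained; a real system would use a perceptual
# hash so near-duplicate images still match.
KNOWN_IMAGE_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Illustrative threshold: matches accumulate per account, and nothing
# reaches a reviewer until the threshold is crossed.
MATCH_THRESHOLD = 10


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual image hash."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_upload(match_count: int, image_bytes: bytes) -> int:
    """Check one photo being uploaded to iCloud against the known set.

    Returns the account's updated match count. Crossing the threshold
    queues the account for human review; only a confirmed review would
    lead to a report to NCMEC.
    """
    if image_hash(image_bytes) in KNOWN_IMAGE_HASHES:
        match_count += 1
    if match_count >= MATCH_THRESHOLD:
        print(f"Account queued for human review ({match_count} matches)")
    return match_count
```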
“At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM),” the tech giant has said.
Apple will introduce the safety features in three areas, one of which gives parents more insight into their children’s online activity. The Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable by Apple, the tech giant explained.
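The on-device gating described for Messages can be illustrated with a minimal sketch: a locally run classifier scores each incoming photo, and the decision to blur or warn is made entirely on the device, so neither the photo nor the verdict is sent to a server. All names, the threshold, and the constant-returning classifier below are hypothetical stand-ins, not Apple’s API.

```python
from dataclasses import dataclass

WARN_THRESHOLD = 0.8  # illustrative confidence cutoff, not Apple's value


@dataclass
class IncomingImage:
    data: bytes
    sender: str


def classify_sensitivity(image: IncomingImage) -> float:
    """Stand-in for the on-device classifier (0 = benign, 1 = sensitive).

    A real implementation would run a trained model locally; a constant
    keeps this sketch runnable.
    """
    return 0.0


def deliver(image: IncomingImage) -> None:
    # The score is computed on the device itself, so the message stays
    # unreadable by Apple regardless of the outcome.
    if classify_sensitivity(image) >= WARN_THRESHOLD:
        print("Warning: this photo may contain sensitive content.")
    else:
        print(f"Displaying photo from {image.sender}")


deliver(IncomingImage(data=b"photo-bytes", sender="alice"))
```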
John Clark, the president and CEO of NCMEC, called Apple’s expanded protection for children “a game changer”.
“With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material. At the National Center for Missing & Exploited Children we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known," Clark said.
Despite the words of praise for “neuralMatch”, the technology has been criticised for possible violations of user privacy.
Experts from Johns Hopkins University and Stanford University have raised concerns that Apple’s scanning technology could erode user privacy.
They argued that the system could later be expanded to scan for images unrelated to child abuse, violating the privacy of Apple users.
Parents who keep innocent photos of their children, for example, of a child in the bath, could also worry that the software will flag the images as abuse content.
The new features will be introduced by Apple later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
“This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time,” Apple said.