https://sputnikglobe.com/20210824/apple-confirms-it-has-already-been-scanning-icloud-mail-to-combat-child-exploitation---1083697328.html
Apple Confirms It Has Already Been Scanning iCloud Mail to Combat Child Exploitation
Sputnik International
2021-08-24T01:12+0000
Sputnik International
feedback@sputniknews.com
+74956456601
MIA „Rossiya Segodnya“
2021
News
en_EN
Eric Friedman, head of Apple's Fraud Engineering Algorithms and Risk unit, puzzled many earlier this month when he claimed that the tech giant has "the greatest platform for distributing child porn." His curious statement has provoked several questions for Apple regarding user privacy, phone scans and the details of its new anti-child abuse effort.
As Apple seeks to detect and report potential child abuse imagery, the tech giant also appears to be coming clean about the scope and duration of its surveillance of users.
The company confirmed to Apple-centered outlet 9To5Mac that its plan to periodically scan users' iCloud photos and iCloud backups for Child Sexual Abuse Material (CSAM) is only partially new, as Apple has already been routinely scanning both incoming and outgoing iCloud mail for such content.
Apple's email scanning has been active since 2019 and applies to those using the Mail app on an iOS device.
The company's admission came in response to 9To5Mac probing the anti-fraud chief's claim that Apple was "the greatest platform for distributing child porn."
The outlet, like many privacy advocates, questioned how Apple would know about the distribution of such content without conducting some form of surveillance.
Apple did not address Friedman's comments directly, but did point to its scanning of users' iCloud mail, as well as the scanning of some "other data," which the company said does not include iCloud backups.
Only a few hundred CSAM reports are reportedly submitted by Apple each year.
Based on Apple's response, it is possible the anti-fraud chief was using known data to make inferences about other exploitative content that exists on its platform.
Apple's planned rollout of new surveillance capabilities has also been slammed by an international coalition of civil and policy rights groups, including the American Civil Liberties Union, Privacy International, the Electronic Frontier Foundation, and Access Now.
The group asserted that the expanded capabilities could be "used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children."
Concerns about the security of iMessage's end-to-end encryption were also raised.
"Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit," the coalition argued.