Apple is officially taking on child predators with new safety features for iPhone and iPad.
One scans for child sexual abuse material (CSAM), which sounds like a good thing. But it has several privacy experts concerned.
So, how does it work? The feature, available on iOS 15 and iPadOS 15 later this year, uses a new proprietary technology called NeuralHash to detect known CSAM images.
Before the image is stored in iCloud Photos, it goes through a matching process on the device against specific CSAM hashes.
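NeuralHash itself is proprietary and its details aren't public, but the matching step described above can be sketched in miniature. The sketch below is a hypothetical stand-in that uses SHA-256 in place of a perceptual hash (so, unlike NeuralHash, it only matches byte-identical files); the hash database contents are invented example data.

```python
# Hypothetical sketch of on-device hash matching against a known-image
# database. SHA-256 stands in for Apple's proprietary perceptual
# NeuralHash; a real perceptual hash would also match visually similar
# (resized, recompressed) images, which SHA-256 does not.
import hashlib

# Hypothetical database of known-image hashes (example data only).
known_hashes = {hashlib.sha256(b"known-image-bytes").hexdigest()}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Hash the image locally and check it against the known set."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known_hash(b"known-image-bytes"))  # True
print(matches_known_hash(b"some-other-image"))   # False
```

The point of doing this on-device is that only a match/no-match signal (wrapped in the cryptographic voucher scheme below), not the photo itself, needs to leave the phone before upload.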
It then uses technology called "threshold secret sharing," which doesn't allow Apple to interpret a photo unless the related account has crossed a threshold of CSAM content.
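Apple hasn't published its exact construction, but "threshold secret sharing" generally refers to schemes like Shamir's, where a secret (here, the key material needed to read match vouchers) can only be reconstructed once some minimum number of shares exists. A minimal Shamir sketch, assuming a (t, n) threshold over a prime field:

```python
# Minimal Shamir (t, n) threshold secret sharing sketch. With fewer
# than t shares, the secret is information-theoretically hidden; with
# t or more, it can be recovered by Lagrange interpolation at x = 0.
import random

PRIME = 2**127 - 1  # field modulus (a Mersenne prime)

def make_shares(secret: int, t: int, n: int):
    """Split `secret` into n shares, any t of which recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def poly(x):  # evaluate the degree-(t-1) polynomial via Horner's rule
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, poly(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange-interpolate the polynomial at x = 0 to get the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=42, t=3, n=5)
print(recover(shares[:3]))   # any 3 shares recover the secret: 42
```

In Apple's described use, each matched image effectively contributes a share, so the vouchers stay unreadable until an account accumulates enough matches to cross the threshold.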
Apple can then report any CSAM content it finds to the National Center for Missing and Exploited Children (NCMEC).
It's worth noting that there is room for false positives. Matthew Green, cybersecurity expert and associate professor at Johns Hopkins University, took to Twitter to voice his concerns.
“To say that we are disappointed by Apple’s plans is an understatement,” said the Electronic Frontier Foundation, arguing that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
We've reached out to Apple for comment and will update this story when we hear back.
Apple says its threshold provides "an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
Once a device crosses that threshold, the report is manually reviewed. If Apple finds a match, it disables the user's account and a report is sent to NCMEC. Users who think their account has been flagged by mistake will have to file an appeal in order to get it back.
While it's tough to criticize a company for wanting to crack down on child pornography, the fact that Apple has the ability to scan someone's photos in general is concerning. It's even worse to imagine a human reviewer looking through private images only to conclude that an account was flagged by mistake.
SEE ALSO: Apple addresses AirTags security flaw with minor privacy update

It's also ironic that Apple, the company that brags about its privacy initiatives, specifically its privacy Nutrition Labels and App Tracking Transparency, has taken this step.
Apple assures users that "CSAM is designed with user privacy in mind," which is why it matches an image on-device before it's sent to iCloud Photos. But they said the same thing about AirTags, and, well, those turned out to be a privacy nightmare.
Topics: Cybersecurity, iPhone, Privacy