Apple is officially taking on child predators with new safety features for iPhone and iPad.
One scans for child sexual abuse material (CSAM), which sounds like a good thing. But it has several privacy experts concerned.
So, how does it work? The feature, available on iOS 15 and iPadOS 15 later this year, uses a new proprietary technology called NeuralHash to detect known CSAM images.
Before the image is stored in iCloud Photos, it goes through a matching process on the device against specific CSAM hashes.
It then uses technology called "threshold secret sharing," which doesn't allow Apple to interpret a photo unless the related account has crossed a threshold of CSAM content.
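The matching-then-threshold flow described above can be sketched in a few lines. This is a minimal illustration only: the hash function, blocklist values, and threshold below are all hypothetical stand-ins (Apple's real NeuralHash is a perceptual neural hash, and the threshold secret-sharing cryptography is far more involved than a simple counter).

```python
import hashlib

# Hypothetical stand-in for NeuralHash: the real system uses a perceptual
# neural hash so near-identical images match; SHA-256 is used here purely
# for illustration.
def toy_image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical blocklist of known hashes (placeholder values).
KNOWN_HASHES = {
    toy_image_hash(b"known-bad-example-1"),
    toy_image_hash(b"known-bad-example-2"),
}

THRESHOLD = 2  # illustrative: flag only after this many on-device matches

def account_crosses_threshold(images: list[bytes]) -> bool:
    """Count on-device matches against the blocklist and report True only
    once the match count reaches the threshold; below it, no individual
    match is surfaced."""
    matches = sum(1 for img in images if toy_image_hash(img) in KNOWN_HASHES)
    return matches >= THRESHOLD
```

In the real design, each match produces an encrypted "safety voucher" rather than a plain counter, and Apple's servers can only decrypt the vouchers once the threshold number of shares exists, which is what keeps sub-threshold accounts unreadable.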
Apple can then report any CSAM content it finds to the National Center for Missing and Exploited Children (NCMEC).
It's worth noting that there is room for false positives. Matthew Green, cybersecurity expert and associate professor at Johns Hopkins University, took to Twitter to voice his concerns.
“To say that we are disappointed by Apple’s plans is an understatement,” said the Electronic Frontier Foundation, arguing that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
We've reached out to Apple for comment and will update this story when we hear back.
Apple says its threshold provides "an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
Once a device crosses that threshold, the report is manually reviewed. If Apple finds a match, it disables the user's account and a report is sent to NCMEC. Users who think their account has been flagged by mistake will have to file an appeal in order to get it back.
While it's tough to criticize a company for wanting to crack down on child pornography, the fact that Apple has the ability to scan someone's photos in general is concerning. It's even worse to think that an actual human being might look through private images only to realize an account was mistakenly identified.
It's also ironic that Apple, the company that brags about its privacy initiatives, specifically its Nutrition Labels and App Tracking Transparency, has taken this step.
Apple assures users that "CSAM is designed with user privacy in mind," which is why it matches an image on-device before it's sent to iCloud Photos. But they said the same thing about AirTags, and, well, those turned out to be a privacy nightmare.