Apple is officially taking on child predators with new safety features for iPhone and iPad.
One scans for child sexual abuse material (CSAM), which sounds like a good thing. But it has several privacy experts concerned.
So, how does it work? The feature, available on iOS 15 and iPadOS 15 later this year, uses a new proprietary technology called NeuralHash to detect known CSAM images.
Before the image is stored in iCloud Photos, it goes through a matching process on the device against specific CSAM hashes.
It then uses technology called "threshold secret sharing," which doesn't allow Apple to interpret a photo unless the related account has crossed a threshold of CSAM content.
Apple can then report any CSAM content it finds to the National Center for Missing and Exploited Children (NCMEC).
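The matching flow described above can be sketched in a few lines. NeuralHash is proprietary and Apple has not published its threshold, so this is purely an illustrative stand-in: SHA-256 of the raw bytes substitutes for the perceptual hash, and the hash set, threshold value, and function names are all hypothetical.

```python
import hashlib

# Hypothetical stand-in for NeuralHash (which is proprietary):
# a plain SHA-256 of the image bytes, used only to show the flow.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder database of known-CSAM hashes (in reality supplied by NCMEC).
KNOWN_HASHES = {image_hash(b"known-flagged-sample")}

# Illustrative threshold; Apple has not disclosed the real number.
MATCH_THRESHOLD = 3

def scan_before_upload(photos: list[bytes]) -> bool:
    """Count on-device matches against the known-hash set.

    The account is flagged for human review only once the number of
    matches crosses the threshold, mirroring the 'threshold secret
    sharing' idea that no single match reveals anything to Apple.
    """
    matches = sum(1 for p in photos if image_hash(p) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```

Note that a real perceptual hash tolerates resizing and re-encoding, which an exact cryptographic hash like SHA-256 does not; that difference is also where the false-positive risk discussed below comes from.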
It's worth noting that there is room for false positives. Matthew Green, cybersecurity expert and associate professor at Johns Hopkins University, took to Twitter to voice his concerns.
“To say that we are disappointed by Apple’s plans is an understatement,” said the Electronic Frontier Foundation, arguing that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
We've reached out to Apple for comment and will update this story when we hear back.
Apple says its threshold provides "an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
Once a device crosses that threshold, the report is manually reviewed. If Apple finds a match, it disables the user's account and a report is sent to NCMEC. Users who think their account has been flagged by mistake will have to file an appeal in order to get it back.
While it's tough to criticize a company for wanting to crack down on child sexual abuse material, the fact that Apple has the ability to scan someone's photos at all is concerning. It's even worse to think that an actual human being might look through private images only to realize an account was mistakenly flagged.
It's also ironic that Apple, a company that brags about its privacy initiatives, specifically its privacy Nutrition Labels and App Tracking Transparency, has taken this step.
Apple assures users that its CSAM detection is "designed with user privacy in mind," which is why it matches an image on-device before it's sent to iCloud Photos. But the company said much the same about AirTags, and, well, those turned out to be a privacy nightmare.