Apple is officially taking on child predators with new safety features for iPhone and iPad.
One scans for child sexual abuse material (CSAM), which sounds like a good thing. But it has several privacy experts concerned.
So, how does it work? The feature, available on iOS 15 and iPadOS 15 later this year, uses a new proprietary technology called NeuralHash to detect known CSAM images.
Before the image is stored in iCloud Photos, it goes through a matching process on the device against specific CSAM hashes.
It then uses technology called "threshold secret sharing," which doesn't allow Apple to interpret a photo unless the related account has crossed a threshold of CSAM content.
Apple can then report any CSAM content it finds to the National Center for Missing and Exploited Children (NCMEC).
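The matching flow described above can be sketched in a few lines of Python. This is purely illustrative: `toy_hash` is a stand-in for NeuralHash (which is a perceptual neural hash, not a cryptographic one), the blocklist and threshold values are invented, and the real system's threshold secret sharing cryptography is omitted entirely.

```python
import hashlib

def toy_hash(image_bytes: bytes) -> str:
    """Stand-in for NeuralHash. The real thing is a perceptual hash that
    tolerates resizing and recompression; SHA-256 here is illustration only."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_account(images: list[bytes],
                        blocklist: set[str],
                        threshold: int = 3) -> bool:
    """Count on-device matches against known hashes; only a match count
    above the threshold flags the account, so isolated matches reveal
    nothing about an individual photo."""
    matches = sum(1 for img in images if toy_hash(img) in blocklist)
    return matches > threshold
```

In the actual design, each photo is uploaded with an encrypted "safety voucher" that Apple can only decrypt once the account exceeds the threshold; the sketch above collapses that cryptography into a simple counter.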
It's worth noting that there is room for false positives. Matthew Green, cybersecurity expert and associate professor at Johns Hopkins University, took to Twitter to voice his concerns.
“To say that we are disappointed by Apple’s plans is an understatement,” said the Electronic Frontier Foundation, arguing that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
We've reached out to Apple for comment and will update this story when we hear back.
Apple says its threshold provides "an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
Once an account crosses that threshold, the flagged content is manually reviewed. If Apple confirms a match, it disables the user's account and sends a report to NCMEC. Users who believe their account was flagged by mistake will have to file an appeal to get it back.
While it's tough to criticize a company for wanting to crack down on child pornography, the fact that Apple has the ability to scan someone's photos in general is concerning. It's even worse to think that an actual human being might look through private images only to realize an account was mistakenly identified.
SEE ALSO: Apple addresses AirTags security flaw with minor privacy update

It's also ironic that Apple, the company that brags about its privacy initiatives, specifically its privacy Nutrition Labels and App Tracking Transparency, has taken this step.
Apple assures users that its CSAM detection "is designed with user privacy in mind," which is why it matches an image on-device before it's sent to iCloud Photos. But Apple said the same thing about AirTags, and, well, those turned out to be a privacy nightmare.
Topics: Cybersecurity, iPhone, Privacy