Apple is officially taking on child predators with new safety features for iPhone and iPad.
One scans for child sexual abuse material (CSAM), which sounds like a good thing. But it has several privacy experts concerned.
So, how does it work? The feature, available on iOS 15 and iPadOS 15 later this year, uses a new proprietary technology called NeuralHash to detect known CSAM images.
Before the image is stored in iCloud Photos, it goes through a matching process on the device against specific CSAM hashes.
It then uses technology called "threshold secret sharing," which doesn't allow Apple to interpret a photo unless the related account has crossed a threshold of CSAM content.
Apple can then report any CSAM content it finds to the National Center for Missing and Exploited Children (NCMEC).
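To make the flow above concrete, here is a heavily simplified sketch. NeuralHash is proprietary and its details are not public, so a plain hash-set lookup stands in for perceptual hash matching; the function name, the hash values, and the threshold value are all hypothetical placeholders, not Apple's actual parameters.

```python
# Simplified, hypothetical sketch of the on-device matching flow.
# A set lookup stands in for NeuralHash perceptual matching; all names
# and values here are illustrative assumptions, not Apple's real system.

KNOWN_CSAM_HASHES = {"hash_a", "hash_b"}  # placeholder hash database
MATCH_THRESHOLD = 30  # Apple has not published its real threshold

def scan_before_upload(image_hashes):
    """Count matches against the known-hash set; flag only past the threshold."""
    matches = sum(1 for h in image_hashes if h in KNOWN_CSAM_HASHES)
    if matches >= MATCH_THRESHOLD:
        return "flag_for_review"   # account crossed threshold -> human review
    return "upload_normally"       # below threshold: nothing is revealed

print(scan_before_upload(["hash_a"] * 31))  # flag_for_review
print(scan_before_upload(["hash_x"] * 5))   # upload_normally
```

The key design point the article describes is that matching happens on the device, before upload, and no single match by itself triggers anything.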
It's worth noting that there is room for false positives. Matthew Green, cybersecurity expert and associate professor at Johns Hopkins University, took to Twitter to voice his concerns.
“To say that we are disappointed by Apple’s plans is an understatement,” said the Electronic Frontier Foundation, arguing that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
We've reached out to Apple for comment and will update this story when we hear back.
Apple says its threshold provides "an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
Once a device crosses that threshold, the report is manually reviewed. If Apple finds a match, it disables the user's account and a report is sent to NCMEC. Users who think their account has been flagged by mistake will have to file an appeal in order to get it back.
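The "threshold secret sharing" mentioned earlier is a standard cryptographic idea: a secret is split into shares such that fewer than a threshold number of shares reveal nothing at all, which is what lets Apple claim it cannot interpret any photo until an account crosses the threshold. Below is a toy Shamir-style sketch of that idea with illustrative parameters; it is not Apple's implementation.

```python
# Toy Shamir (t-of-n) secret sharing, the idea behind "threshold secret
# sharing": with fewer than t shares, the secret is information-theoretically
# hidden. Parameters are demo values, not anything Apple uses.
import random

PRIME = 2**61 - 1  # a Mersenne prime, large enough for this demo

def make_shares(secret, threshold, n):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 reconstructs the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(12345, threshold=3, n=5)
print(recover(shares[:3]))  # 12345 -- any 3 of the 5 shares suffice
```

With only two of the five shares, every candidate secret remains equally likely, which is the property the scheme relies on.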
While it's tough to criticize a company for wanting to crack down on child sexual abuse material, the fact that Apple has the ability to scan someone's photos in general is concerning. It's even worse to think that an actual human being might look through private images only to realize an account was mistakenly identified.
It's also ironic that Apple, the company that brags about its privacy initiatives, specifically its privacy Nutrition Labels and App Tracking Transparency, has taken this step.
Apple assures users that its CSAM detection is "designed with user privacy in mind," which is why it matches an image on-device before it's sent to iCloud Photos. But it said the same thing about AirTags, and, well, those turned out to be a privacy nightmare.
Topics: Cybersecurity, iPhone, Privacy