Apple is officially taking on child predators with new safety features for iPhone and iPad.
One scans for child sexual abuse material (CSAM), which sounds like a good thing. But it has several privacy experts concerned.
So, how does it work? The feature, available on iOS 15 and iPadOS 15 later this year, uses a new proprietary technology called NeuralHash to detect known CSAM images.
Before the image is stored in iCloud Photos, it goes through a matching process on the device against specific CSAM hashes.
It then uses technology called "threshold secret sharing," which doesn't allow Apple to interpret a photo unless the related account has crossed a threshold of CSAM content.
Apple can then report any CSAM content it finds to the National Center for Missing and Exploited Children (NCMEC).
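The matching-and-threshold flow described above can be sketched in a few lines. This is purely illustrative: NeuralHash is a proprietary perceptual hash and the real threshold value is unpublished, so the hash function, hash database, and threshold below are stand-in assumptions, not Apple's implementation.

```python
import hashlib

# Hypothetical stand-ins: NeuralHash is a proprietary perceptual hash;
# SHA-256 is used here only to show the shape of the matching flow.
KNOWN_CSAM_HASHES = {"<hash-a>", "<hash-b>"}  # placeholder database entries
MATCH_THRESHOLD = 30  # assumed value; Apple has not published the real one

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for the on-device perceptual hash step."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photo_library: list[bytes]) -> int:
    """Count photos whose hash appears in the known-CSAM database."""
    return sum(1 for img in photo_library
               if image_hash(img) in KNOWN_CSAM_HASHES)

def should_flag_account(photo_library: list[bytes]) -> bool:
    # Per the threshold-secret-sharing scheme, only once the match count
    # crosses the threshold can the matched content be decrypted and the
    # account escalated for human review; below it, nothing is readable.
    return count_matches(photo_library) >= MATCH_THRESHOLD
```

The key property the real scheme provides, and this sketch only gestures at, is that below the threshold Apple cannot inspect even the matched images; the cryptography, not just a counter, enforces that.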
It's worth noting that there is room for false positives. Matthew Green, cybersecurity expert and associate professor at Johns Hopkins University, took to Twitter to voice his concerns.
“To say that we are disappointed by Apple’s plans is an understatement,” said the Electronic Frontier Foundation, arguing that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
We've reached out to Apple for comment and will update this story when we hear back.
Apple says its threshold provides "an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
Once a device crosses that threshold, the report is manually reviewed. If Apple finds a match, it disables the user's account and a report is sent to NCMEC. Users who think their account has been flagged by mistake will have to file an appeal in order to get it back.
While it's tough to criticize a company for wanting to crack down on child sexual abuse material, the fact that Apple now has the ability to scan someone's photos at all is concerning. It's even worse to imagine an actual human being looking through private images only to realize an account was mistakenly flagged.
It's also ironic that Apple, a company that brags about its privacy initiatives, specifically its privacy Nutrition Labels and App Tracking Transparency, has taken this step.
Apple assures users that its CSAM detection is "designed with user privacy in mind," which is why it matches an image on-device before it's sent to iCloud Photos. But Apple said the same thing about AirTags, and, well, those turned out to be a privacy nightmare.