Pornographic deepfakes of Taylor Swift went viral on X (formerly Twitter) this week, highlighting the dangers of AI-generated imagery online.
Synthetic or manipulated media that may deceive people isn't allowed on X, according to its policy, and the platform's safety team posted on Friday that it's "actively removing all identified images and taking appropriate actions against the accounts responsible for posting them."
By Saturday, users noticed that X attempted to curb the problem by blocking searches for "Taylor Swift," though not certain related terms, The Verge reported.
Mashable was also able to reproduce the error page for the search terms "Taylor Swift AI" and "Taylor AI." The terms "Swift AI," "Taylor AI Swift," and "Taylor Swift deepfake" remain searchable on the platform, though, with manipulated images still displayed on the "Media" tab.
As Mashable culture reporter Meera Navlakha pointed out in an article about the deepfakes of Swift, major social media platforms are struggling to contain AI-generated content. Because these images can be created quickly and by virtually anyone, platforms like X have been inundated with them in recent months. Making Swift's name unsearchable suggests that X doesn't know how to handle the array of deepfake imagery and video on its platform.
On Friday, White House press secretary Karine Jean-Pierre called the situation "alarming." She also said there should be legislation to address it, hinting that the issue of AI image moderation may soon reach Congress.