The hottest memes of 2020! The most viral videos of the year! The hottest hashtags!
The year-in-review lists from Facebook, YouTube, and Twitter were missing the biggest story of the year: their failure to stop the spread of misinformation.
The QAnon conspiracy theory found a shocking amount of mainstream support. Black Lives Matter protests were exploited to spread lies about violence coming to Small Town, USA. And officials had to fight destructive wildfires and false stories about antifascists at the same time.
And, of course, there was the misinformation superspreader event: the COVID-19 pandemic.
The consequences have been dire. But we don't have to let it continue in 2021.
The 2016 U.S. presidential election gave social media platforms a powerful taste of what bad actors could do when weaponizing the tools they created.
Yet the companies were still hesitant to act. Sure, some nefarious users were banned. But as long as the content wasn't illegal or causing immediate, quantifiable harm ... why would they take action?
Judging by the timing of some of this year’s biggest misinformation-related policy changes, it seems most of the major social media platforms gave themselves a “deadline” to do something before the 2020 U.S. presidential election.
Then, in March, the coronavirus pandemic hit the U.S. There were anti-mask demonstrations. People drank bleach. And some people refused to believe COVID-19 is real.
“There was just so much conflicting information about the virus and the fact that everyone had time on their hands [due to lockdowns] to actually look at it all,” explained Gita Johar, a professor at the Columbia Business School. “People were sharing everything just trying to make sense of what was going on.”
In a recently published study, Johar found that people who “feel a sense of exclusion and uncertainty,” perpetually or during an unpredictable time, like a pandemic, are more likely to spread what they see on social media.
“In fact, we found that people seem to be able to tell what’s true and false apart, but they still share information regardless,” she said.
And there were plenty of trolls, conspiracy theorists, and politicians willing to flood scared, confused, and angry users with false information.
“All the COVID misinformation actors have to do is sow doubt,” said Imran Ahmed, CEO of the nonprofit Center for Countering Digital Hate. “They adopted the Steve Bannon tactic: ‘flood the zone with shit.’ And that's what we're seeing now. Actors are flooding the zone with nonsense.”
Fadi Quran, campaign director for the nonprofit activist organization Avaaz, which has done extensive research on disinformation online, agreed, saying, “Trump and others within the Steve Bannon network have been pushing claims about voter fraud for years.”
Facebook, too afraid of offending conservatives who accuse the company of having an anti-conservative bias, basically let Trump, Bannon, and others on the right do whatever they wanted.
Big Tech companies took plenty of half-measures this year.
Facebook limited political ads in the week running up to the election. The Trump campaign found a way around this new policy.
Facebook also slapped fact-check labels on rampant misinformation, an approach Ahmed called "disastrous."
“That's what the social media companies want," he said. "They want the debate on the platform.”
That's because the more time a user spends interacting with content, no matter how false or toxic it is, the more opportunities there are to serve that user ads.
“We know the people behind this misinformation and we know that what they're saying is untrue,” Ahmed continued. “Yet for the social media companies, it's an economically productive market for them.”
And there's a lot of content that falls through the cracks on Facebook, a site with nearly 2 billion daily active users. Facebook has around 15,000 content moderators working for the site through third parties. An NYU report found that Facebook should double the number of content moderators, and that they should be in-house employees.
And if Facebook misses something? An MIT study found that users assumed misinformation that hadn't received a fact-check label must be true.
“When it comes to labeling, they did not implement it the ways experts in the field of debunking disinformation recommended that they implement it,” said Quran.
According to Quran, Facebook, for example, doesn’t retroactively “correct the record” for users who saw misinformation before it was fact-checked.
In a report from the Center for Countering Digital Hate, which has studied anti-vaxxers and coronavirus conspiracy theorists, volunteers flagged 912 posts on Instagram, YouTube, and Twitter for misinformation. Only one in 20 was removed.
Social media companies have "been engaged in a process of gaslighting the world with the idea that they've taken these incredible, unprecedented measures when in fact they're doing very little beyond spin,” Ahmed said.
In October 2020, however, Facebook did do something major. The company banned QAnon. It was a welcome move.
But QAnon has existed since 2017 and has sparked numerous incidents of real-life violence. As a major spreader of conspiracy theories during the height of the coronavirus pandemic, imagine how much less shit would have flooded the zone if Facebook took action earlier.
“My estimate is that if Facebook implemented that ban two years before they did, it would have prevented between 5 to 10 million Facebook users from joining QAnon conspiracy groups and pages,” said Quran.
To fix the misinformation epidemic, the social media companies have to want to fix it, something they've shown little appetite for.
“Indiscriminate sharing can become a bad habit,” explained Johar. “Unless we take steps at the source of the problem, which is the supply of misinformation, I don't think that you can just rely on consumers to be able to disentangle and make sense on their own.”
With coronavirus vaccines on the way, attacking this problem has never been more important. Avaaz’s Quran believes that there are some extremely simple steps to curb misinformation, especially when it comes to public health.
One proposal: Facebook should change its algorithm to stop promoting pages that frequently spread misinformation.
“There are repeat misinformers still creating viral content,” he said. “Facebook could easily just stop amplifying them.”
While it’s too late for the social media companies to fix the problems misinformation created in 2020, it’s not too late for 2021.
“Action now could still save lives,” explained Ahmed. “If you take away those voices in the very cynical, very organized, very disciplined anti-vax networks, it would give an opportunity for health authorities to get their message across clearly.”