The hottest memes of 2020! The most viral videos of the year! The hottest hashtags!
The year-in-review lists from Facebook, YouTube, and Twitter were missing the biggest story of the year: their failure to stop the spread of misinformation.
The QAnon conspiracy theory found a shocking amount of mainstream support. Black Lives Matter protests were exploited to spread lies about violence coming to Small Town, USA. And officials had to fight destructive wildfires and false stories about antifascists at the same time.
And, of course, there was the misinformation superspreader event: the COVID-19 pandemic.
The consequences have been dire. But we don't have to let it continue in 2021.
The 2016 U.S. presidential election gave social media platforms a powerful taste of what bad actors could do when weaponizing the tools they created.
Yet, the companies were still hesitant to act. Sure, some nefarious users were banned. But, as long as the content wasn't illegal or causing immediate, quantifiable harm ... why would they take action?
Judging by the timing of some of this year's biggest misinformation-related policy changes, it seems most of the major social media platforms gave themselves a "deadline" to do something before the 2020 U.S. presidential election.
Then, in March, the coronavirus pandemic hit the U.S. There were anti-mask demonstrations. People drank bleach. And some people refused to believe COVID-19 is real.
“There was just so much conflicting information about the virus and the fact that everyone had time on their hands [due to lockdowns] to actually look at it all,” explained Gita Johar, a professor at the Columbia Business School. “People were sharing everything just trying to make sense of what was going on.”
In a recently published study, Johar found that people who “feel a sense of exclusion and uncertainty,” perpetually or during an unpredictable time, like a pandemic, are more likely to spread what they see on social media.
“In fact, we found that people seem to be able to tell what’s true and false apart, but they still share information regardless,” she said.
And there were plenty of trolls, conspiracy theorists, and politicians willing to flood scared, confused, and angry users with false information.
“All the COVID misinformation actors have to do is sow doubt,” said Imran Ahmed, CEO of the nonprofit Center for Countering Digital Hate. “They adopted the Steve Bannon tactic: ‘flood the zone with shit.’ And that's what we're seeing now. Actors are flooding the zone with nonsense.”
Fadi Quran, campaign director for the nonprofit activist organization Avaaz, which has done extensive research on disinformation online, agreed, saying, “Trump and others within the Steve Bannon network have been pushing claims about voter fraud for years.”
Facebook, too afraid of offending conservatives who accuse the company of having an anti-conservative bias, basically let Trump, Bannon, and others on the right do whatever they wanted.
Big Tech companies took plenty of half-measures this year.
Facebook limited political ads in the week leading up to the election. The Trump campaign found a way around the new policy.
Facebook also slapped fact-check labels on rampant misinformation, an approach Ahmed called "disastrous."
“That's what the social media companies want," he said. "They want the debate on the platform.”
That's because the more time a user spends interacting with content, no matter how false or toxic it is, the more opportunities there are to serve that user ads.
“We know the people behind this misinformation and we know that what they're saying is untrue,” Ahmed continued. “Yet for the social media companies, it's an economically productive market for them.”
And there's a lot of content that falls through the cracks on Facebook, a site with nearly 2 billion daily active users. Facebook has around 15,000 content moderators working for the site through third parties. An NYU report found that Facebook should double the number of content moderators and make them in-house employees.
And if Facebook misses something? An MIT study found that users assumed misinformation that hadn't received a fact-check label must be true.
“When it comes to labeling, they did not implement it the ways experts in the field of debunking disinformation recommended that they implement it,” said Quran.
According to Quran, Facebook, for example, doesn’t retroactively “correct the record” for users who saw misinformation before it was fact-checked.
In a report from the Center for Countering Digital Hate, which has studied anti-vaxxers and coronavirus conspiracy theorists, volunteers flagged 912 posts on Instagram, YouTube, and Twitter as misinformation. Only one in 20 was removed.
Social media companies have "been engaged in a process of gaslighting the world with the idea that they've taken these incredible, unprecedented measures when in fact they're doing very little beyond spin," Ahmed said.
In October 2020, however, Facebook did do something major: The company banned QAnon. It was a welcome move.
But QAnon has existed since 2017 and has sparked numerous incidents of real-life violence. It was also a major spreader of conspiracy theories during the height of the coronavirus pandemic. Imagine how much less shit would have flooded the zone had Facebook taken action earlier.
“My estimate is that if Facebook had implemented that ban two years before they did, it would have prevented between 5 and 10 million Facebook users from joining QAnon conspiracy groups and pages,” said Quran.
To fix the misinformation epidemic, the social media companies have to want to fix it, something they've shown little appetite for.
“Indiscriminate sharing can become a bad habit,” explained Johar. “Unless we take steps at the source of the problem, which is the supply of misinformation, I don't think that you can just rely on consumers to be able to disentangle and make sense on their own.”
With coronavirus vaccines on the way, attacking this problem has never been more important. Avaaz’s Quran believes that there are some extremely simple steps to curb misinformation, especially when it comes to public health.
One proposal: Facebook should change its algorithm to stop promoting pages that frequently spread misinformation.
“There are repeat misinformers still creating viral content,” he said. “Facebook could easily just stop amplifying them.”
While it’s too late for the social media companies to fix the problems misinformation created in 2020, it’s not too late for 2021.
“Action now could still save lives,” explained Ahmed. “If you take away those voices in the very cynical, very organized, very disciplined anti-vax networks, it would give an opportunity for health authorities to get their message across clearly.”
Topics: Social Media