Time: 2025-02-28 23:38:48 · Source: web aggregation · Editor: Baike
Here's a study supported by the objective reality that many of us experience already on YouTube.
The streaming video company's recommendation algorithm can sometimes send you on an hours-long video binge so captivating that you never notice the time passing. But according to a study from the software nonprofit Mozilla Foundation, trusting the algorithm means you're actually more likely to see videos featuring sexualized content and false claims than videos tailored to your personal interests.
In a study with more than 37,000 volunteers, Mozilla found that 71 percent of the videos participants flagged as objectionable had been served by YouTube's recommendation algorithm. The volunteers used a browser extension to track their YouTube usage over 10 months, and when they flagged a video as problematic, the extension recorded whether they had come across the video via YouTube's recommendations or on their own.
The study called these problematic videos "YouTube Regrets," signifying any regrettable viewing experience had on YouTube. Such Regrets included videos "championing pseudo-science, promoting 9/11 conspiracies, showcasing mistreated animals, [and] encouraging white supremacy." One girl's parents told Mozilla that their 10-year-old daughter fell down a rabbit hole of extreme dieting videos while seeking out dance content, leading her to restrict her own eating habits.
What causes these videos to be recommended is their ability to go viral. If videos with potentially harmful content manage to accrue thousands or millions of views, the recommendation algorithm may circulate them to users rather than focusing on those users' personal interests.
YouTube removed 200 videos flagged through the study, and a spokesperson told the Wall Street Journal that "the company has reduced recommendations of content it defines as harmful to below 1% of videos viewed." The spokesperson also said that YouTube has launched 30 changes over the past year to address the issue, and that its automated system now detects and removes 94 percent of videos that violate YouTube's policies before they reach 10 views.
While it's easy to agree on removing videos featuring violence or racism, YouTube faces the same misinformation policing struggles as many other social media sites. It previously removed QAnon conspiracies that it deemed capable of causing real-world harm, but plenty of similar-minded videos slip through the cracks by arguing free speech or claiming entertainment purposes only.
YouTube also declines to make public any information about how exactly the recommendation algorithm works, calling it proprietary. Because of this, it's impossible for us as consumers to know whether the company is really doing all it can to keep such videos from circulating via the algorithm.
While 30 changes over the past year is an admirable step, if YouTube really wants to eliminate harmful videos on its platform, letting its users plainly see its efforts would be a good first step toward meaningful action.