Where platforms, social technology, and the internet at large are concerned, over its brief history, Facebook's been the exception to so, so many rules.
Except for one.
"There’s no silver bullet for abuse. It’s just a constant area of focus. Unfortunately abuse is human nature," Kayvon Beykpour, CEO of Periscope, explained over coffee back in January, when asked point blank about abuse on the platform.
Months later, we now know just how much Facebook has worked to get a handle on abuse, and get it right—and also, just how far it is from that goal. On Sunday, The Guardian published a trove of leaked documents from Facebook detailing the social network's moderation guidelines.
The files gave the world a (sometimes-graphic, often uncomfortable) look at how the platform decides where to draw the lines. For example? The statement "To snap a bitch's neck, make sure to apply all your pressure to the middle of her throat" is allowed on the platform, because it's a hypothetical, non-credible, non-specific threat. Meanwhile, photos of animal abuse are allowed, but videos are not, the documents also revealed.
Facebook may be the largest social network in the world, but it's far from the only platform struggling with how to curb abuse and moderate inappropriate content. Each has faced decisions on what to moderate, and how to go about enforcing those policies on networks that number in the millions (and for Facebook, almost billions) of people.
There appears to be a natural evolution for a lot of platforms—starting with an embrace of free speech, and then, moving toward the realization that rules are necessary.
In Facebook's early days, it had a light touch. That's pretty easy, of course, when the platform is still just college kids sharing photos with their friends. No news feed, no live video—none of that was there. As Facebook grew into a global phenomenon, and its technology grew with it, the company learned the hard way that free speech is an important principle, but rules are a necessity.
It's a lesson other platforms have also learned.
Twitter, which held itself up as a platform stridently in favor of free speech, eventually responded to broad outcry over abuse by introducing more stringent rules, as well as new tools related to those rules, both for the reporting of abusive content and ways for users to filter it. Reddit, which built itself on a hardline of free speech, introduced a new content policy in the summer of 2015, and began banning racist communities.
Others have had them from the start—and been open and honest about what they are.
Adi Sideman, CEO of livestreaming platform YouNow, has spoken extensively about moderation, like in this Mic profile of his streaming service. Just this past week, Medium CEO and Twitter cofounder Ev Williams discussed online abuse in a New York Times profile.
Meanwhile, on Facebook, we had to append a lengthy correction to a story about porn and piracy on the platform due to Facebook's own misleading public claims—as well as what the company told me over the phone. And it was hard not to cringe when Facebook CEO Mark Zuckerberg brought up a shooting in Cleveland and said "We need to do better" with policing content during his keynote at Facebook's annual developers conference last month.
In response to some macabre incidents Facebook had to face down, Zuckerberg told the world in April that he's committed to preventing tragedies from happening on his platform. And yet: the platform was deliberately obscuring how moderators decide what's potentially inappropriate, and what's allowed to remain.
That lack of transparency just isn't the case for other networks and platforms, for the most part, and that, to me, is one of the key places where Facebook went wrong. When Twitter cofounder Ev Williams launched blogging site Medium, he released "Medium Rules" the same day.
"Some parts of the Internet lack rules. This isn’t one of them," the post begins.
It's difficult to pin down exactly when Facebook released standards and rules similar to what Medium did. In March 2015, Facebook rewrote its own. "Today's update to our Community Standards provides more detail on these policies as well as explanations and examples of what isn't acceptable to share on Facebook. Our policies themselves aren't changing," Zuckerberg wrote on Facebook at the time.
Of course, the initial reactions to The Guardian's "Facebook Files" were decidedly critical of the social network. But Sideman of YouNow pointed to the difficulties of scale a network like Facebook, well, faces:
"With [all] the volume levels and the endless possible content scenarios Facebook has, they are not going to get it right every time and questionable policies and bad content will pop up again and again. That should be expected," Sideman explained over email.
Even if Facebook had a perfect rule book, trying to enforce it across the platform's nearly two billion people is a technical problem for which there is no good answer. Technology's being developed to try to understand and decode the actual sentiment behind words, but for the time being, the best way to go about the Sisyphean task of content moderation is still good ol' fashioned human content mods.
That's not a very efficient solution—and it's also one hell of an expensive one. Still, Facebook doubled down, and announced that it's hiring another 3,000 moderators to comb through questionable content.
And yet: That's not a perfect answer, either.
"Imagine if they get it right 99% of the time," Sideman continued. "It means they get it wrong 1% of the times: that is still tens or hundreds of thousands of bad pieces of content that slip through every day."
And of course, he finished, "these examples will be touted to show how bad of a job Facebook is doing."
So what's a social network with two billion users to do? Transparency could help. Maybe for the better, The Guardian just forced a big step forward for Facebook.
"Keeping people on Facebook safe is the most important thing we do," Monika Bickert, head of global policy management at Facebook, said in a statement in reaction to The Guardian's exposé. "We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously."
Yet, perhaps Facebook should just admit (and be humble about the fact) that they'll never get it right. At least not any time soon.
During my conversation with Beykpour about how Periscope curbs abuse, he detailed the community management tool in which users are asked to vote on whether someone should be removed from a broadcast.
"We didn’t want to play God. It’s just not scalable for us to play God. The number of languages, and we can’t be in every broadcast. It’s not something that we can be involved in. And the context is really important," Beykpour said.
Sideman also pointed to the power of the users themselves.
"The long term solution is a combination of technology and harnessing the user base to help in moderation and community management," Sideman wrote.
Whether or not the future is best left in the hands of the users—and a crowdsourced governance of sorts—is a story we're only going to fully understand in the years to come.
But in this moment, it's getting harder and harder not to wonder: Is any system for deciding the future of abusive content on the internet substantially better than the one we have in place?