UPDATE: Jan. 3, 2017, 10:30 a.m. EST: Facebook reached out to Mashable after publication of this story to clarify its policy for handling what is shown to users. While previous statements from Facebook indicated that live video is only removed after user flagging and that its team of moderators does not "proactively" review videos, a clarified statement revealed that Facebook's human moderators will monitor videos that reach a certain threshold of viewership. A Facebook spokesperson declined to disclose what the threshold is or whether it matches what is shown on the live map.
Facebook also said it has made some changes to its policy to address sharing static images as live videos. For example, static graphics are still allowed, despite not really being live videos, but they are downgraded by the News Feed algorithm.
Facebook really, really, really wants you to love live video.
Facebook has started nudging people to not only create live videos, but consume them too. An update automatically pushes more live video to your smartphone screen. A bunch of people have been seeing a message pop up on their mobile News Feed: "Feeling the winter vibes? Go Live and winter it up with friends," the company suggests.
But if you're going live, what type of content are you joining?
For a snapshot of what's on Facebook Live, check out a map of public live videos, available on the desktop version of Facebook. Click on the glowing blue dots, and you can seemingly teleport yourself across the world. But instead of historic sights or live events, you're more likely to come across flashes of boobs and butts, replete with an audience suggesting what to reveal next, and maybe a pirated TV show or soccer game, too.
A decent share of those showcased "live" videos aren't even truly live. On Monday, one of the most popular Facebook Live videos was a pirated stream of Tom and Jerry. Other videos showed static images, with prompts for users to vote in polls.
Despite the millions of dollars Facebook spends campaigning for you to go live—even though the product's been slowly rolled out over the last year and a half—the world's largest social network still hasn't deployed a solution that effectively blocks explicit, illegal or misrepresented "live" content from popping up all over the desktop map.
They're still working on it. Joaquin Candela, Facebook's director of applied machine learning, said earlier this month that the company has "an algorithm that detects nudity, violence, or any of the things that are not according to our policies," but it was still in the research stage. Facebook didn't provide Mashable with further insight on a timeline for deploying the algorithm.
Facebook may be working—however hard, or not—on solutions to flag content right now. But one thing's clear: the company is increasingly telling its community to watch live video, and to go live themselves, regardless of Facebook Live's current issues.
Beyond the live video content publicly available on Facebook's world map, Facebook has also rolled out a live video tab in its mobile app. That feature surfaces live videos from the people and Pages you follow. But again, that doesn't mean the video is high quality, within Facebook's guidelines, or enjoyable for all users.
So if you were, say, under the impression that stripping and posing provocatively in a bed wouldn't be allowed? You're wrong. Turns out that those acts in and of themselves don't violate Facebook's Community Standards if the woman's nipples and vagina are covered.
Facebook Live isn't always safe for work, but Facebook doesn't have a problem with that.
Credit: Facebook screenshot

For example, after commenting on a live video of the girl posing in the video above, the stream appeared in the News Feed of my brother-in-law and sister as soon as they opened their apps. Indeed, the News Feed can expose Facebook users to much more than they may wish to see.
Periscope's policies ban sexual nudity. YouNow's policies are clearly written and addressed to both parents and younger viewers on what is allowed, including stripping.
But with Facebook Live, exhibitionists can have a home. Why would a camgirl be on Facebook without a direct revenue model? The video above also included a link to a third-party website that requires a subscription. That would technically be in violation of Facebook's Community Guidelines, but evidently no one reported the video.
And of course, there's always PayPal.
Credit: Facebook screenshot

Pirated content is banned, but within Facebook's current system it isn't efficiently monitored and addressed in real time. A Facebook Page called Perfect Babies, which asks members to share photos of their children, now repeatedly runs streams of Tom and Jerry episodes.
Combating porn and piracy clearly is far from a new issue, let alone one exclusive to Facebook. Twitter's Periscope and (the now-defunct app) Meerkat both faced the same problem.
But Facebook looks like it's going heavy on marketing and promoting live video, while remaining light on progress where technology and editorial curation are concerned.
People began employing Facebook Live for provocative use cases as soon as they possibly could. For example, in May, three teens allegedly streamed sexual acts, publicly, to the network. Still, we've learned Facebook's policies very much leave its users exposed to explicit content.
Facebook would prefer we call them "reviewers"
Facebook calls itself a tech company, but it doesn't use software to moderate pornographic or violent videos playing live on the network (to a potential daily audience of more than 1 billion users). Instead, it relies on human viewers and human "editors"—though Facebook would prefer we call them "reviewers"—to monitor, flag and take action against content in violation of its editorially chosen policies.
Over email, a Facebook spokesperson explained the company's stance on violent or pornographic live content: "We don’t use any systems to filter out certain videos. We are not proactively reviewing content and depend on our community to report violations."
It does, however, have a system called Rights Manager, which lets copyright holders see and flag a live video, and then request that it be taken down.
Facebook's actions, or rather inactions, aren't illegal. The company follows the Digital Millennium Copyright Act, which states that a rights holder must submit a takedown notice to a service provider for a video to be removed. The law also permits explicit content, and so, at least for now, does Facebook, as long as nobody reports it.
That's an issue for Facebook as it pushes Live as its next big thing. It's telling you to go live from its ads on television, on billboards, and on your News Feed. It's even paying celebrities and publishers—including Mashable—to go live. And yet, it hasn't prioritized quality.
Then again, it's not that surprising. The issue follows in step with Facebook's tendency to lack immediate quality assurance and transparency. It's worth remembering that Live isn't Facebook's only product with a porn problem—remember the launch of Marketplace?
The technology to detect what's in live video does exist, whether deployable from Facebook's codebase or not. Other live video providers, including Twitter's Periscope and YouNow, told us they employ both software and human reviewers to moderate video content.
"We use a combo of tools and humans who validate if content violates guidelines before anything is removed," a Twitter spokesperson said.
"YouNow uses both automated and manual systems to detect undesirable content from our service, and to bring this content to the forefront so that our 24/7 moderation team may review it for possible removal," wrote a YouNow spokesperson over email.
Twitter said it still requires a human reviewer to remove a live video from Periscope, but the tech does help for flagging. YouNow, however, will let its software remove content.
"We believe it is important to use human judgment to review content that is violating our policies. That being said, in certain circumstances where we are certain that content is in violation of our Terms of Use or community guidelines, this content may be removed prior to a human approving the removal," a YouNow spokesperson wrote.
And yes, live video detection has already been built by third-party companies. For example, Dextro, a computer vision company that worked with Periscope (before Periscope was acquired by Twitter) and YouNow, has software to detect objects such as guns.
Meanwhile, Facebook, one of the world's most valuable technology giants, doesn't use software to remove potentially terms-violating videos. According to what Facebook told us, the company relies on you and a rights holder to alert them, and then, a human to make the final judgment to remove it.
Why doesn't Facebook do the same?
"Given the level of resources [Facebook] has, it's surprising," said David Luan, cofounder of Dextro. "I think what must be the case is there's some sort of specific set of product requirements they have to get right."
Perhaps Facebook's process is fast — even faster than Twitter's or YouNow's — without the tech layer. Facebook declined to disclose how big its moderation team is.
Facebook also holds an advantage over other networks with its real-name policy. While people can expose themselves or share pirated content somewhat anonymously on Periscope, Facebook makes it more difficult to create fake accounts by requiring authentic names.
Interestingly—despite not actively using tech to remove explicit live video content automatically—Facebook has repeatedly blamed technical glitches for several instances of live video being taken down. For example, the company said a "technical glitch" led to the temporary removal of a video that captured the last moments of Philando Castile after he was shot by a police officer.
"It only takes one report for something to be reviewed," Facebook said in the blog post it released after the incident with the video of Castile.
Facebook has built and deployed a system to address copyright infringement. In April (the same month Facebook Live became available to everyone), Facebook introduced Rights Manager, which lets publishers upload their content to Facebook's system. A system called Audible Magic detects video and audio within live and uploaded videos. If something matches, the live video can be reported and then removed.
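The mechanics of such a system can be sketched in miniature: rights holders register their content, its audio and video are reduced to compact fingerprints, and incoming live streams are fingerprinted the same way and checked against the index. The toy Python below illustrates the matching idea only; the class names, the tiny window size, and exact hashing are all illustrative assumptions, not how Audible Magic or Rights Manager actually works (real systems use fuzzy perceptual fingerprints that survive re-encoding).

```python
import hashlib

CHUNK = 4  # samples per fingerprint window (unrealistically small, for illustration)

def fingerprints(samples):
    """Hash each overlapping window of the sample stream into a fingerprint."""
    return {
        hashlib.sha256(bytes(samples[i:i + CHUNK])).hexdigest()
        for i in range(len(samples) - CHUNK + 1)
    }

class RightsRegistry:
    """Toy stand-in for a rights holder's uploaded-content index."""

    def __init__(self):
        self.index = {}  # fingerprint -> registered title

    def register(self, title, samples):
        """Rights holder uploads content; we index its fingerprints."""
        for fp in fingerprints(samples):
            self.index[fp] = title

    def match(self, live_samples):
        """Return titles whose fingerprints appear in a live stream."""
        return {self.index[fp] for fp in fingerprints(live_samples) if fp in self.index}

# A live stream containing a registered clip gets flagged for review:
registry = RightsRegistry()
registry.register("Tom and Jerry S01E01", [10, 20, 30, 40, 50, 60])
hits = registry.match([1, 2, 10, 20, 30, 40, 99])
print(hits)  # {'Tom and Jerry S01E01'}
```

In this sketch a match only flags the stream; consistent with the article, a takedown would still require the rights holder (or a human reviewer) to act on the report.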
YouTube, the older video giant, has had a similar system called Content ID, but not only does it help keep copyright holders at ease, it also helps them make money.
The Financial Times reported Wednesday that Facebook was "working to build out a copyright tracking technology" similar to the one used by YouTube. Part of this technology already exists through Audible Magic, but according to the report, Facebook aims to let rights holders not only request that content be removed but also earn revenue from licensing agreements.
Facebook declined to comment on the FT report or answer further questions about its work building a new system, because the company doesn't comment on speculation. The company confirmed it's dedicating more resources, such as making more hires, to address copyright issues.
A Facebook spokesperson said that outside of "always working to make improvements," the company has nothing to announce right now.
Beyond clear violations and questionable material, there's little discoverability.
As a Facebook spokesperson told us—and as the content evidently shows—there aren't any editors. That's a far cry from Periscope, which hired an editor-in-chief, and YouNow, which also has a team that chooses editor's picks. Meanwhile, Facebook's the company that laid off the journalists that once helped manage trending news topics and, at times, selected live videos to represent them.
At the moment, Facebook is on the hunt for a head of news, but it remains to be seen how much power that person will have over what is shown live—or really, over any product Facebook has, or will have.