
In yet another step to combat misinformation on the platform, Facebook is taking the step of calling out pages which repeatedly spread fake news.

If you try to like such a page, you will see a pop-up saying that the page has "repeatedly shared false information," and that "independent fact-checkers said the information is false." You will then be presented with a choice of going back to the previous page or following the page anyway.

There will also be a "learn more" link that provides more information on why the page has been labeled as such, as well as another "learn more" link with details on Facebook's fact-checking program.

Image credit: Facebook

The company also said it would expand penalties for individual Facebook accounts that repeatedly share misinformation, reducing the reach of their posts in other users' News Feeds.

Image credit: Facebook

Finally, Facebook has redesigned the notifications that pop up when users share content that fact-checkers have labeled as false. The notification will now include the fact-checker's article explaining why the post is misleading, together with an option to share that article. People will also be notified that posts from accounts that repeatedly share fake news will be ranked lower in the News Feed, making them less likely to be seen.

SEE ALSO: Facebook's Oversight Board upholds Trump's suspension

Over the past couple of years, Facebook has introduced a number of measures to combat misinformation on the platform. These include message forwarding limits on Messenger, prompts encouraging users to read an article before sharing it, warning labels on fake news, and -- most famously -- blocking Donald Trump from using the platform. Despite these efforts, the company still has a long way to go before it can say it's really gotten rid of fake news.

Topics: Facebook, Social Media