
Facebook Is Taking More Steps To Stop Misinformation And Fake News

For starters, the platform's adding more fact-checkers, a Verified Badge for Messenger and a "Group Quality" feature.


Facebook says it's taking new steps to stop misinformation and keep "problematic content" from going viral across its platforms. 

The company has faced criticism for not doing enough to stop the spread of misinformation. Facebook has said its users are increasingly turning to private spaces, like groups and messaging, to share content, a shift that could make offensive content and propaganda harder to monitor.

To help combat the problem, Facebook said in a blog post Wednesday that it's updating its "remove, reduce and inform" strategy. For starters, the platform is launching a new "Group Quality" feature that lets users know what content has been flagged and removed from a group.

It'll also try to fight fake news faster by adding more fact-checkers, part of its goal to help users "decide what to read, trust and share." And it's adding "Trust Indicators" that will give users more information when they click the Context Button on English and Spanish content.

Facebook is also adding some new safety and privacy features, including a Verified Badge and a more detailed blocking tool in Messenger, and the ability for users to remove posts and comments from a group even after they've left it.