Facebook will reduce the reach of Groups spreading misinformation

Facebook has announced new steps it is taking to address "problematic content" on its platform, including reducing the reach of Groups that spread misinformation. The restriction will apply to Groups that "repeatedly" share such content, according to Facebook, which also plans to reduce the amount of low-quality content its users see in their News Feed.

The changes are part of the "remove, reduce, and inform" strategy Facebook introduced in 2016 amid its reckoning with fake news surrounding that year's US presidential election. A variety of hot-button topics remain problematic for the company, including the presence of anti-vaxxers who use the platform to spread misinformation about vaccines.

As part of its "reduce" approach, Facebook plans to limit the content you'll see in your News Feed from Groups that repeatedly share misinformation. The company is also adding a new News Feed ranking signal called Click-Gap, which it says will demote low-quality content so users see less of it.
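Facebook has described Click-Gap only at a high level: it reportedly looks for domains that draw a disproportionate share of their traffic from Facebook relative to their footprint on the rest of the web. The Python sketch below is purely illustrative of that idea; the function name, inputs, and example numbers are assumptions, not Facebook's actual implementation.

```python
# Illustrative sketch only: Facebook has not published Click-Gap's mechanics.
# This toy heuristic flags domains whose share of Facebook clicks far exceeds
# their share of inbound links on the wider web. All names and figures here
# are hypothetical.

def click_gap_score(facebook_clicks: int, total_facebook_clicks: int,
                    inbound_links: int, total_inbound_links: int) -> float:
    """Ratio of a domain's Facebook-click share to its web-link share.

    A score well above 1.0 means the domain's popularity is driven almost
    entirely by Facebook distribution rather than broad web presence, one
    plausible marker of low-quality content a ranking system could demote.
    """
    click_share = facebook_clicks / max(total_facebook_clicks, 1)
    link_share = inbound_links / max(total_inbound_links, 1)
    return click_share / max(link_share, 1e-9)

if __name__ == "__main__":
    # Hypothetical example: a domain with 2% of Facebook clicks but only
    # 0.01% of inbound web links scores 200 -- a large "click gap".
    score = click_gap_score(facebook_clicks=2_000_000,
                            total_facebook_clicks=100_000_000,
                            inbound_links=50,
                            total_inbound_links=500_000)
    print(f"click-gap score: {score:.1f}")  # ~200.0
```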

Facebook is also expanding the amount of content reviewed by the Associated Press as a third-party fact-checker, and it is collaborating with outside experts on "new ways to fight" fake news on its platform, the company said in a statement today.

The News Feed Context Button will soon be expanded to images, and Facebook is introducing a new Community Standards section that will let users track the monthly updates it makes to those rules. Other changes include adding more data to the Page Quality tab, letting users remove their comments and posts from a Group after they leave it, adding Verified Badges to Messenger, and updating the Block feature in its messaging app.

The company's full list of efforts is laid out in its announcement post.