Instagram users can now flag false content for fact-checker review

Facebook has expanded its Third-Party Fact-Checking Program to cover photos and videos shared on Instagram, the company has announced. Under this expansion, Instagram users can now flag content they believe is false, triggering a review from a verified fact-checker who will make the final determination. According to the company, false content will be demoted but not removed.

The ability to flag false content is rolling out to Instagram users in the United States starting today, and the company says the same feature will reach users in other markets by the end of the month. The new reporting option can be found by tapping the three-dot menu button and choosing 'It's inappropriate.'

Under that menu, users will see a new option that reads 'False information.' Choosing this option prompts verified fact-checkers to review the post; they will either confirm that it is false or dismiss the report.

If fact-checkers rate the content false, Instagram will allow it to remain live on the service but will not show it on the platform's related hashtag pages. In addition, the post will be 'downplayed' on the Explore page, according to Poynter. This limits an account's ability to promote false content and reduces the number of Instagram users who see it.

The new reporting option is the latest in a series of efforts Facebook has made to curb the spread of false information across its social platforms. Critics say such content has been used to manipulate the public on political, social, and healthcare matters, among other things.