Facebook test warns some users when they’ve seen extremist content

Brittany A. Roston - Jul 1, 2021, 5:47pm CDT
Facebook has rolled out a new test that lets some users know when they’ve been exposed to extremist content, as well as another feature that offers users help in cases where they believe one of their friends may be adopting extremist views. The move comes amid growing concerns that social media platforms are being used to radicalize populations.

Social media platforms connect people, but that’s not always a good thing. The platforms have become home to increasingly extremist content, ranging from widely shared misinformation designed to manipulate emotions to closed groups that actively promote extreme, harmful views.

Some Facebook users have noticed new prompts that address this problem, including one that alerts them when they’ve been exposed to potentially extremist content and another that offers help if they’re concerned about a friend. The company confirmed the test soon after, saying it is part of a larger effort to test ways the platform can help at-risk users.

A Facebook spokesperson told CNN that the company is working with non-profit organizations and experts on extremism as part of this effort, but that it doesn’t have more to share at this time. Users have reported two different alerts, including one that reads, “We care about preventing extremism on Facebook. Others in your situation have received confidential support.”

Another message reads, “Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others.” This alert notifies users when they’ve viewed content that may be extremist in nature and likewise offers resources for support. It’s unclear how many users are seeing the prompts at this time.

Whether the warnings will be enough to reverse the tide of misinformation and extremism on the platform is a bigger question. Facebook has hosted manipulative media for years; it played a notable role in the 2016 US presidential election, for example. Though Facebook has rolled out measures to address manipulation on its platform, including attaching warning labels to fake news, those efforts seem to have had little effect on users who continue to find and share misinformation.
