Facebook plans to fight fake news and spam sites by filtering out chronic over-sharers, who, the social network's research suggests, routinely end up filling your wall with clickbait. It's the latest in the company's attempts to cut down on the sort of often-baseless sensationalism that was highlighted as a particular problem in the run-up to the US election last year. Now, Facebook will tune its News Feed algorithm to reduce the number of low-quality links you see.
Most Facebook users probably have at least one compulsive over-sharer among their "friends". Pushing out dozens of links every day, many of questionable accuracy, these users not only contribute to the spread of misinformation but also sap some of Facebook's own usefulness. That can mean people coming back and checking their accounts less frequently, something the company is unsurprisingly averse to.
Now, it says it has identified who those people are. “Our research shows that there is a tiny group of people on Facebook who routinely share vast amounts of public posts per day,” Adam Mosseri, VP of the Facebook News Feed, says, “effectively spamming people’s feeds. Our research further shows that the links they share tend to include low quality content such as clickbait, sensationalism, and misinformation.”
“As a result,” Mosseri concludes, “we want to reduce the influence of these spammers and deprioritize the links they share more frequently than regular sharers.”
The change shouldn't affect most users or publishers, Facebook insists, as long as they're sharing quality content. It will also apply only to links shared on the site, and specifically to those leading to individual articles. General website domains, along with Facebook Pages, videos, photos, check-ins, and status updates, will not be included in the changes.
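As a rough sketch, the scoping Facebook describes could be modeled as a simple two-part test: is the sharer one of the flagged over-sharers, and is the item an individual article link? The Python below is purely illustrative; the threshold, function names, and item types are invented assumptions, not Facebook's actual implementation.

```python
# Illustrative sketch of the described scoping rule. The 50-posts-per-day
# threshold and all names here are assumptions made for this example only.

HEAVY_SHARER_THRESHOLD = 50  # assumed cutoff for "routinely sharing vast amounts"

def is_heavy_sharer(public_posts_per_day: int) -> bool:
    """Flag accounts that routinely push out large numbers of public posts."""
    return public_posts_per_day >= HEAVY_SHARER_THRESHOLD

def should_deprioritize(item_type: str,
                        links_to_article: bool,
                        sharer_posts_per_day: int) -> bool:
    """Deprioritize only individual article links shared by heavy sharers.

    Per the reported policy, domains, Pages, videos, photos, check-ins,
    and status updates are exempt regardless of who shares them.
    """
    if item_type != "link" or not links_to_article:
        return False
    return is_heavy_sharer(sharer_posts_per_day)

# A status update is never demoted, even from a heavy sharer:
print(should_deprioritize("status", False, 120))  # False
# An article link from a heavy sharer is demoted:
print(should_deprioritize("link", True, 120))     # True
# The same link from a typical user is unaffected:
print(should_deprioritize("link", True, 3))       # False
```

The key design point reported here is that the demotion keys off the sharer's behavior, not the destination site itself, which is why the same article link is treated differently depending on who posts it.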
It's not Facebook's first push to control the tenor of what gets shared. Back in May, the company tweaked its algorithms to reduce the number of links to "sensational," "misleading," or "spammy" sites appearing in the feed. To do that, it first assessed hundreds of thousands of sites to gauge their accuracy, while also filtering out adverts that could be considered "disruptive, shocking or malicious".
An artificial intelligence was trained on those results and then put in charge of deciding which links get through and which don't. However, Facebook has been careful not to appear too strict with these systems, presumably in the hope of avoiding cries of censorship.