Facebook content moderation revealed to be a farce

Facebook has received a lot of flak recently for its mishandling of the Cambridge Analytica data leak. While that particular case has seemingly died down a bit, the social networking giant is far from out of the woods. A new documentary from Channel 4 Dispatches puts Facebook back in the hot seat, revealing just how little its promises to protect users from harmful content might mean, especially when those responsibilities are outsourced and offloaded onto third parties that might not exactly share Facebook's concerns.

An undercover reporter from Firecrest Films applied as a content moderator at Cpl Resources plc in Dublin, one of Facebook's content moderation centers and the largest serving the UK. The reporter underwent the usual training and orientation, which covered how to handle content that users have reported as potentially harmful or misleading. With Facebook constantly refining its policies in response to criticism and scandals, you might think it would be a straightforward matter.

Unsurprisingly, it's the exceptions that are now giving Facebook a massive headache. Several types of content that clearly violate Facebook's policies are kept online and simply flagged as "disturbing". The reasons given to trainees vary, from the content serving as a "do not do this" warning despite depicting graphic violence, to Facebook simply having no legal culpability if the user doesn't admit upfront that he or she is underage, even when that user is shown to be harming herself.

The report also quotes venture capitalist Roger McNamee, an early Facebook investor and Zuckerberg's mentor, as saying that Facebook's business relies on extreme content. It gets more people onto the platform, one way or another, and gets them to spend more time on those pages. More eyeballs on ads mean more money for Facebook.

Naturally, Facebook denies such a characterization and insists that its business actually relies on providing a safe place for users to share and interact. To be fair, the questionable practices and statements came not from Facebook itself but from outsourced companies and third parties. Then again, that only underscores how bad Facebook has been at actually implementing its own policies and making good on its promises.