YouTube has been at the center of a growing online controversy over videos that appear to inappropriately target children and, more worrying, predatory comments aimed at children. Amid the criticism (though not because of it, according to YouTube), the company tweaked the service so that flagged videos wouldn't appear in the YouTube Kids app. That wasn't adequate, though, so now it is back with several more changes.
Some of the criticism directed toward YouTube concerns predatory comments left on videos featuring children. Some of those videos are suggestive in highly inappropriate ways; others are ordinary videos on which some users were leaving predatory comments.
YouTube has started pulling some of the more controversial videos that concerned online critics have highlighted, and it says it will take a more aggressive stance toward predatory comments.
YouTube has never allowed predatory comments in its video section, but that hasn’t stopped them from being made and, in many cases, remaining live on the service. “Starting this week,” YouTube said in a blog post today, “we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”
In addition, YouTube says it will soon also provide “a comprehensive guide on how creators can make enriching family content,” a move that may aim to cut down on the bizarre videos seemingly targeting kids that are being referred to online as “ElsaGate” content.
As well, YouTube says it is pulling advertisements from videos that target families but contain inappropriate content, such as offensive themes and violent material. This applies "even if done for comedic or satirical purposes," according to YouTube. Though it doesn't go into much detail, the company says it has "further strengthened the application of" this policy, which went live back in June.
In the last couple of weeks we expanded our enforcement guidelines around removing content featuring minors that may be endangering a child, even if that was not the uploader’s intent. In the last week we terminated over 50 channels and have removed thousands of videos under these guidelines, and we will continue to work quickly to remove more every day.
Does this mean all the bizarre videos featuring cartoon characters are going to disappear from the site? No. It's not that simple, says YouTube. The company says some of this content is "more nuanced" and harder to make a decision on. However, YouTube will approach it more aggressively, explaining:
To help us better understand how to treat this content, we will be growing the number of experts we work with, and doubling the number of Trusted Flaggers we partner with in this area.