YouTube has been heavily criticized for years over the growing number of fake kids’ videos on its platform that show violent, disturbing, or otherwise mature content unsuitable for young audiences. In response, the company quietly announced a policy change on its support forum this week stating that it will remove ‘misleading family content’ going forward, including videos that target ‘younger minors.’
It’s not hard to find these misleading ‘kids’ videos on YouTube — many have millions of views and have fueled various conspiracy theories. Kids may be drawn to this content because it uses familiar characters, such as Elsa from Frozen and Spider-Man. Many of these videos are (poorly) animated.
Though they feature friendly children’s characters, these videos may present disturbing content, including depictions of injections, traumatic births, death, and other material unsuitable for young viewers. YouTube explicitly targets these videos with its latest policy update; the company provides the following examples of the types of misleading videos it will remove:
– A video with tags like “for children” featuring family-friendly cartoons engaging in inappropriate acts like injecting needles.
– Videos with prominent children’s nursery rhymes targeting younger minors and families in the video’s title, description or tags, that contain adult themes such as violence, sex, death, etc.
– Videos that explicitly target younger minors and families with phrasing such as “for kids” or “family fun” in the video’s title, description and/or tags that contain vulgar language.
The removals won’t apply to vulgar animations that are clearly aimed at adults. YouTube advises creators to use tags, descriptions, and titles that make it clear the content is intended for mature audiences. Such videos may be age-restricted.