YouTube loosens its age-restriction policy for violent game content

YouTube has published an update to its policy on violence in video games, reassuring creators that, starting today, most videos featuring violent game content won't be age-restricted. The clarification is meant to ensure the company consistently enforces its violent and graphic content policies, which bar certain types of videos from YouTube.

Put simply, YouTube forbids creators from uploading content featuring violence or gore that it determines is 'intended to shock or disgust viewers.' The platform also forbids content that encourages acts of violence. In some cases, however, YouTube elects to age-restrict content rather than remove it, meaning the video stays on the platform but viewers must be signed in to their accounts to watch it.

Under this policy, YouTube says it may age-restrict 'fictional violence' that includes graphic depictions, such as the finishing moves in Mortal Kombat or Doom. This is a problem for gaming creators, whose videos may be age-restricted simply for including a scene that depicts fictional violence.

In its update today, YouTube said that 'scripted or simulated violent content found in video games will be treated the same as other types of scripted content.' That means future gaming uploads may no longer be age-restricted even if they feature simulated or scripted violence.

However, YouTube warns creators that some gaming videos may still end up age-restricted if they focus solely on gory or violent imagery. Creators should also note that this change doesn't affect the company's advertiser-friendly guidelines: videos showing violent game content may still not appeal to many advertisers.