YouTube's war on extremism brings out the big guns

Earlier this summer, YouTube announced a four-pronged plan to combat terrorism and the promotion of violent extremism on its platform. The four strategies pair machine learning algorithms with review by real people, suggesting that YouTube wants to tackle the issue from multiple angles. Now, one month in, YouTube has given us an update on how the initiative is going.

In a post on the YouTube Blog, the company dug into statistics on how its machine learning systems are performing at detecting and removing videos that promote terrorism and violent extremism. Over the past month, YouTube says, those algorithms have improved to the point where 75% of the videos removed for violent extremism were taken down before a single user flagged them.
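YouTube hasn't published any details about how this pipeline works internally, but the broad idea it describes is a score-and-triage flow: high-confidence detections are removed automatically, while borderline cases go to human reviewers. Here's a minimal Python sketch of that concept; the function names, thresholds, and classifier interface are all hypothetical, not YouTube's actual system.

```python
# Conceptual sketch only -- YouTube's real pipeline is not public.
# All names and thresholds here are hypothetical.

REMOVE_THRESHOLD = 0.95   # high-confidence score: automatic takedown
REVIEW_THRESHOLD = 0.70   # mid-confidence score: escalate to a human

def triage(video, classifier):
    """Route a video based on a hypothetical policy-violation score."""
    score = classifier.score(video)       # estimated probability of a violation
    if score >= REMOVE_THRESHOLD:
        return "removed_automatically"    # taken down before any user flags it
    if score >= REVIEW_THRESHOLD:
        return "queued_for_human_review"  # real people make the final call
    return "no_action"
```

In a sketch like this, the 75% figure would correspond to the share of takedowns coming out of the first branch rather than from user flags.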

Accuracy isn't the only improvement; those algorithms have scaled well too, doubling both the number of videos taken down and the rate at which they're removed over the past month. The systems should keep getting better as time goes on, but YouTube admits they'll never be perfect. To fill the gaps, YouTube has partnered with 15 additional NGOs, including the Anti-Defamation League and the Institute for Strategic Dialogue, and will consult with these organizations to gain insight on topics like hate speech and extremism.

The third arm of this strategy is likely to be the most controversial. YouTube says it will soon begin applying stricter penalties to videos that aren't illegal but have been flagged by the community as promoting hate speech or extremism. A flagged video won't be removed entirely; instead, it will be placed behind an interstitial notification, won't appear in recommendations, and won't show features like comments and likes. This treatment hasn't rolled out yet, but it will begin appearing on desktop in the next few weeks, with mobile to follow.
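In effect, this "limited state" is a bundle of per-video feature flags. The sketch below is just one way to picture the treatment being described; the class and field names are invented for illustration and don't reflect anything YouTube has published about its internals.

```python
from dataclasses import dataclass

@dataclass
class VideoPresentation:
    """Hypothetical flags for how a video is shown to viewers."""
    behind_interstitial: bool = False  # warning notice gates playback
    in_recommendations: bool = True    # eligible for recommended feeds
    comments_enabled: bool = True      # comment section visible
    likes_visible: bool = True         # like counts shown

# Normal videos keep the defaults; a community-flagged borderline
# video stays up but loses its engagement features.
LIMITED_STATE = VideoPresentation(
    behind_interstitial=True,
    in_recommendations=False,
    comments_enabled=False,
    likes_visible=False,
)
```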

The final part of this strategy involves early intervention. When people search for designated keywords, they'll be redirected to a playlist of videos "that directly confront and debunk violent extremist messages." To complement this, YouTube will also begin to "amplify" videos that speak out against extremism.
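In practice, redirection like this amounts to intercepting certain search queries and substituting curated results. Here's a toy Python sketch of that idea; the keyword set and playlist identifier are placeholders, since YouTube hasn't shared which terms trigger the redirect.

```python
# Placeholder values -- the real keyword list and playlist are not public.
DESIGNATED_KEYWORDS = {"example-extremist-term"}
COUNTER_NARRATIVE_PLAYLIST = "playlist/counter-extremism-videos"

def search(query: str, normal_results: list) -> list:
    """Return counter-narrative content when a query hits designated terms."""
    tokens = set(query.lower().split())
    if tokens & DESIGNATED_KEYWORDS:
        # Early intervention: surface videos that confront extremist messaging
        return [COUNTER_NARRATIVE_PLAYLIST]
    return normal_results
```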

Some of these policies are almost guaranteed to stir controversy, so we'll see how the rollout evolves from here. In the meantime, head down to the comments section and tell us what you think of these new policies: do you agree with them, or will they hurt more than they help?