AI-Generated YouTube Videos Are Being Used To Spread Malware

AI has made some amazing strides in recent times. You can use freely available online tools to generate scripts, code, letters, poems, and full-blown images based on any subject you can think of. Although it is still relatively early days, AI is rapidly changing the way we operate as a society. While many of AI's effects may be positive, bad actors are also making use of the new tools on offer. Some of them are even using AI to create YouTube videos and trick unsuspecting viewers into downloading malware.

According to CloudSEK, the bulk of the people affected by the scam were engaging in shifty practices themselves. Most of the videos promoting malicious links related to obtaining cracked versions of software like Photoshop, Premiere Pro, Autodesk 3ds Max, and AutoCAD. This type of scam is also on the rise, with the cybersecurity firm estimating that such videos have increased by 200% to 300% month-on-month since November 2022.

The concept of the scam is simple. An AI model generates a video of a person with a "trustworthy" appearance who provides instructions on a particular topic. Those instructions involve downloading something from a link in the video's description. That link leads to malicious software capable of stealing things like passwords, credit card details, and files.

YouTube accounts are also being taken over by scammers

More worrying still are the types of accounts this scam is showing up on. Hackers have gained access to popular accounts with over 100,000 subscribers and uploaded videos to them. The topics aren't always related to piracy, so there is a chance completely innocent people could also be targeted. YouTube does eventually help the rightful owner regain access, and the scammers' videos are taken down. But given the popularity of some of the accounts being targeted, a lot of people could be affected before that happens.

The scammers have also been observed targeting inactive accounts with a reasonable number of subscribers. This is potentially a bigger issue, as it can take a while before the inactive, or lightly active, account holder notices the hijacking. As a result, videos linking to malicious software could stay up for a long time. The scammers even add comments to their videos from fake accounts, praising the uploader for providing useful advice.

While YouTube is almost certainly taking action, users should also take steps to avoid falling victim to these scams. Never click a link you don't fully trust, and never download a file unless you're certain of its contents and who is distributing it. As AI improves, these scams will only get more sophisticated, so stay vigilant to avoid falling for this or any other internet scam.