Google's DeepMind aims to teach AI to master StarCraft II

DeepMind, Google's artificial intelligence subsidiary, has already developed a bot that could beat humans at the ancient Chinese game of Go, but now it's looking to master a much more modern game. The researchers have just announced a new partnership with game developer Blizzard that will see the pair develop a platform to teach AI how to play StarCraft II, the immensely popular real-time strategy (RTS) title.

The new project was announced at Blizzard's own BlizzCon this weekend, and rather than focusing on DeepMind building its own AI to crush human opponents at StarCraft II, it aims to give everyone from hobbyists to scientists a way to build and train their own AI bots to play the game. Blizzard says it will release an API in the coming months, allowing anyone who's interested to participate.
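Details of Blizzard's API had not been published at the time of the announcement, but agent-training platforms of this kind typically expose an observe-act loop: the environment hands the agent an observation, the agent picks an action, and the environment returns a reward. Here is a minimal sketch of that pattern; every name in it (`ScriptedEnv`, the toy observations and actions) is a hypothetical stand-in, not Blizzard's or DeepMind's actual interface:

```python
import random

class ScriptedEnv:
    """Hypothetical stand-in for a StarCraft II training environment.
    A real environment would expose map state, units, and resources."""
    def __init__(self, episode_length=10):
        self.episode_length = episode_length
        self.t = 0

    def reset(self):
        """Start a new episode and return the first observation."""
        self.t = 0
        return {"minerals": 50, "step": self.t}  # toy observation

    def step(self, action):
        """Advance one game step; return (observation, reward, done)."""
        self.t += 1
        reward = 1.0 if action == "gather" else 0.0  # toy reward signal
        done = self.t >= self.episode_length
        return {"minerals": 50 + self.t, "step": self.t}, reward, done

def run_episode(env, policy):
    """Generic observe-act loop common to RL training platforms."""
    obs = env.reset()
    total_reward = 0.0
    done = False
    while not done:
        action = policy(obs)
        obs, reward, done = env.step(action)
        total_reward += reward
    return total_reward

# A trivially random policy, as a hobbyist's first bot might be.
random.seed(0)
random_policy = lambda obs: random.choice(["gather", "build", "attack"])
print(run_episode(ScriptedEnv(), random_policy))
```

The interesting research then lives entirely inside `policy`: replacing the random choice with a learned model is what the platform is meant to make accessible to hobbyists and scientists alike.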

In a blog post announcing the partnership, Oriol Vinyals, a DeepMind researcher (and former top-ranked StarCraft player from Spain), explains why the game, one of the most competitive titles in professional gaming worldwide, was chosen for AI research:

"StarCraft is an interesting testing environment for current AI research because it provides a useful bridge to the messiness of the real-world. The skills required for an agent to progress through the environment and play StarCraft well could ultimately transfer to real-world tasks."

In terms of gaming, this could lead to things like better AI for StarCraft II, improved AI player coaches, and, of course, the possibility of an AI bot that can take on even the best human players. But DeepMind hopes its research will have real-world applications too, such as in the science and energy fields.

Serious StarCraft II fans need not worry just yet, however, as the researchers note they are a long way off from developing an AI that can take on a top-ranked player.