OpenAI, the research lab co-founded by Elon Musk that had previously taken up this challenge, managed to beat human Dota 2 opponents in August, but did so with significant handicaps. Until now, however, AI had still failed to beat human champions at a more complicated strategy game, StarCraft II, despite having learned to defeat the game's built-in bots at the highest difficulty level.
Google-owned DeepMind’s AI, called AlphaStar, took on Dario “TLO” Wünsch and Grzegorz “LiquidMaNa” Komincz, playing five one-on-one matches against each player – and won all ten.
The DeepMind team trained AlphaStar by first having it watch StarCraft II match replays for three days, then splitting the AI into several distinct versions of itself and pitting them against one another in internal tournaments for seven days.
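In rough terms, that two-phase pipeline – learning from human replays, then letting a population of agent copies compete – could be sketched as follows. This is a toy illustration only, not DeepMind's actual code; names such as Agent, imitate and play_match are hypothetical stand-ins.

```python
import random

class Agent:
    """Toy stand-in for a StarCraft II-playing policy (hypothetical)."""
    def __init__(self, name, skill=0.0):
        self.name = name
        self.skill = skill

    def imitate(self, replays):
        # Phase 1: learn from human match replays.
        # Here a toy skill bump; the real system trains a neural network.
        self.skill += 0.01 * len(replays)

    def clone(self, name):
        # Branch off a distinct copy of the agent.
        return Agent(name, self.skill)

def play_match(a, b):
    """Decide a toy match, biased toward the higher-skill agent."""
    p_a = a.skill / (a.skill + b.skill + 1e-9)
    return a if random.random() < p_a else b

# Phase 1: bootstrap a single agent from replays ("three days").
replays = [f"replay_{i}" for i in range(100)]  # placeholder replay data
seed = Agent("seed")
seed.imitate(replays)

# Phase 2: split into several versions and run internal
# tournaments against each other ("seven days").
league = [seed.clone(f"agent_{i}") for i in range(4)]
for _ in range(10):
    for a in league:
        b = random.choice([x for x in league if x is not a])
        winner = play_match(a, b)
        # Winners are reinforced; in the real system this is where
        # reinforcement-learning updates would occur.
        winner.skill += 0.05

best = max(league, key=lambda ag: ag.skill)
print("strongest agent after the tournaments:", best.name)
```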
There were some restrictions during the matches, as the AI had only learned to play as, and against, one of the game's three races.
“It was very hard to judge what AlphaStar was doing,” Komincz said on a livestream that showed some of the matches. “It was an incredible experience.”
“Once it started to grasp the basic rules of the game, it started exhibiting amusing behavior such as immediately worker rushing its opponent, which actually had a success rate of 50% against the 'Insane' difficulty standard StarCraft II AI,” Blizzard said of DeepMind at BlizzCon in November.
DeepMind’s AlphaGo AI had already beaten the world’s top-ranked Go player, Ke Jie, in a series of matches back in 2017, mastering the ancient strategy game.