Tuesday 15 March 2016

Microsoft's New Software Turns Minecraft into A Testing Ground for AI


Artificial intelligence has beaten the ancient board game of Go, but can it master Minecraft? Researchers from Microsoft are opening up the game to computer scientists who want to train up AI programs.

Using games to test an AI's ability to teach itself a set of rules is an established approach, but most games used so far have been fairly simple. In February last year, for example, the Google-owned DeepMind (the same company that tackled Go) created an AI that taught itself to play 49 Atari games. This was an incredible achievement, but the games involved were basic, including titles like Breakout that required the AI to master only a few rules (hit blocks; get a high score) in a two-dimensional world.

At the time, DeepMind acknowledged that the next step would be to move onto more complex games, and Microsoft is providing exactly that sort of next step. "Minecraft is the perfect platform for this kind of research because it’s this very open world," said Katja Hofmann, one of the researchers from Microsoft's Cambridge lab, in a press release. "You can do survival mode, you can do ‘build battles’ with your friends, you can do courses, you can implement your own games. This is really exciting for artificial intelligence because it allows us to create games that stretch beyond current abilities."


To allow computer scientists to plug their programs into Minecraft, Hofmann and her colleagues created AIX: an open-source platform that acts as an AI interface for the game. Hofmann says that using this software, researchers will be able to test their AI on all the complex decision-making the game demands. You can tell an AI controlling a character to find the highest spot in the game, for example, and then the computer will have to work out how to navigate the terrain, make weapons, and kill zombies. Just like real life, of course.
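To make the "find the highest spot" example concrete, here is a minimal sketch of the kind of observe-decide-act loop such an interface exposes. All class and method names below (`ToyWorld`, `observe`, `move`, and so on) are invented for illustration and are not the real AIX API; the "world" is reduced to a 2-D height map, and the "AI" is simple greedy hill-climbing.

```python
# Hypothetical sketch: these names are invented for illustration,
# not taken from the real AIX platform.

class ToyWorld:
    """A tiny stand-in for Minecraft-like terrain: a 2-D grid of heights."""

    def __init__(self, heights):
        self.heights = heights   # grid of terrain heights
        self.pos = (0, 0)        # the agent starts in one corner

    def observe(self):
        """Return the agent's position and the height beneath it."""
        x, y = self.pos
        return self.pos, self.heights[x][y]

    def neighbours(self):
        """List the grid cells adjacent to the agent."""
        x, y = self.pos
        cells = []
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < len(self.heights) and 0 <= ny < len(self.heights[0]):
                cells.append((nx, ny))
        return cells

    def move(self, cell):
        self.pos = cell


def climb_to_highest(world, max_steps=100):
    """Greedy policy: repeatedly step onto the tallest adjacent cell."""
    for _ in range(max_steps):
        _, here = world.observe()
        best = max(world.neighbours(),
                   key=lambda c: world.heights[c[0]][c[1]])
        if world.heights[best[0]][best[1]] <= here:
            break                # no taller neighbour: reached a summit
        world.move(best)
    return world.observe()


heights = [[1, 2, 3],
           [2, 3, 5],
           [1, 4, 9]]
pos, height = climb_to_highest(ToyWorld(heights))
print(pos, height)  # → (2, 2) 9
```

The real task is of course far harder: in Minecraft the agent sees the world through pixels rather than a tidy grid, and a greedy policy gets stuck on local peaks, which is exactly why the open-ended game is interesting as a testbed.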

Using Minecraft also allows researchers to test AI as if it were operating a physical avatar. "It allows you to have 'embodied AI'," AIX software engineer Matthew Johnson told BBC News. "So, rather than have a situation where the AI sees an avatar of itself, it can actually be inside, looking out through the eyes of something that is living in the world. We think this is an essential part of building this kind of general intelligence." The company says this means AIX is a stepping stone to creating the sort of AI interface that could one day be applied to real-life robots.

Unsupervised learning of the sort that AIX fosters is the next big step for artificial intelligence. Although deep learning systems created by companies like Facebook and Google have proved incredibly successful for commercial applications in recent years, the sort of intelligence they display is narrow. They can understand spoken commands and recognize images, for example, but they still make common-sense mistakes. Tools like AIX will help create better methods for AIs to teach themselves and, subsequently, to start learning what the whole world is like, not just a small slice of it.
