Researchers at Google’s AI research lab DeepMind recently gave a pair of small humanoid robots the tools they needed to learn how to play soccer on their own. Using DeepMind’s latest deep reinforcement learning techniques, the robots picked up the skills necessary to run around kicking the ball and trying to score goals. That’s cool; just don’t let them learn how to play Call of Duty or Counter-Strike.
The robots had to learn everything from the ground up, first figuring out how to control their humanoid bodies and walk, then run, before progressing to kicking the ball. Once they had the basics down, they needed to apply everything they’d learned to actually play the game. Robotic soccer matches – they’re coming soon.
According to DeepMind research scientist Guy Lever, “In order to ‘solve’ soccer, you have to actually solve lots of open problems… there’s controlling the full humanoid body, coordination, which is really tough for AGI, and actually mastering both low-level motor control and things like long-term planning.” Long-term planning: personally, I’ve never been good at it. I can barely even plan what to have for dinner, which might explain why nine nights out of ten, it’s a peanut butter and jelly sandwich on a hotdog bun.