Robots get around by mimicking primates
Robots that visualise their surroundings like primates do can step out into uncharted territory. Give a friend directions such as "it's across the street from a petrol station, just after a red brick building on the right..." and you can be pretty sure they'll find what they are looking for. Robots, on the other hand, are hopeless at following such cues, because they can't envision a perspective other than their own. But that's about to change.
By mimicking how primates visualise an unfamiliar environment - a process called mental rotation - researchers are building a new kind of guidance system for robots.
Many species of animals perform mental rotation - a poorly understood aspect of spatial reasoning that is nonetheless an integral part of high-level cognition.
"If I tell you to turn left, you will probably ask whose left, mine or yours?" says Ronald Arkin of Georgia Institute of Technology in Atlanta, who is leading the effort to incorporate this technique into software for controlling robots. "You have to transform your frame of reference," he says.
The team is now testing its software in a lab setting. The researchers first supply the robot with a destination - a simplified image of how objects in the environment will look from a given perspective. The robot then uses depth information from an on-board Kinect motion sensor to establish how objects look in its surroundings.
Once it has built a picture of where it is, the robot "mentally" rotates the orientation of objects to match its destination, and then plots a path. As it trundles along, it continues to take images of its surroundings and compare them to its destination, just to make sure it is on the right track. In tests, a small four-wheeled robot used this method to find its way 6 metres across a lab floor to the right spot.
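The article does not describe the team's algorithm in detail, but the core idea - rotating an internal model of the scene until it lines up with the supplied destination view - can be illustrated with a minimal sketch. Everything below is an assumption for illustration: a 2D world, known landmark correspondences, and a simple brute-force search over candidate angles, none of which is drawn from Arkin's actual system.

```python
import math

def rotate(points, theta):
    # Rotate 2D landmark points about the origin by theta radians.
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def alignment_error(view_a, view_b):
    # Sum of squared distances between corresponding landmarks.
    return sum((ax - bx) ** 2 + (ay - by) ** 2
               for (ax, ay), (bx, by) in zip(view_a, view_b))

def best_rotation(observed, goal_view, step_deg=1):
    # "Mentally rotate" the observed scene through candidate angles
    # and keep the one that best matches the goal view.
    return min(range(0, 360, step_deg),
               key=lambda d: alignment_error(
                   rotate(observed, math.radians(d)), goal_view))

# Hypothetical example: the goal view is the same scene seen
# from a frame rotated 90 degrees relative to the robot's own.
landmarks = [(2.0, 0.0), (0.0, 1.5), (-1.0, -1.0)]
goal = rotate(landmarks, math.radians(90))
print(best_rotation(landmarks, goal))  # → 90
```

A real system would work on noisy 3D depth data without known correspondences, so it would need a registration method (such as iterative closest point) rather than an exhaustive search, but the frame-of-reference transformation being solved is the same.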
It's a humble beginning, but Arkin says it's the first time a robot has demonstrated the ability to receive visual instructions and act on them without a map. The work will be presented in December at the ROBIO conference in Guangzhou, China. "When the world isn't as you expect it to be, this will help you," he says, adding that the system could also be adapted to use speech recognition software to understand voice commands and use them to build a picture of the destination being described.
The team's work is taking robotic autonomy into untested waters, says Jeffrey Krichmar, who studies cognitive robotics at the University of California, Irvine. "There has been some work with speech recognition but a robot that can take advice and apply it is a very open area," he says.
Giving robotic vehicles the ability to interpret an outside perspective would greatly improve their ability to navigate in the absence of conventional technologies, like GPS. "We expect this to give a cognitive push to robot navigation. It moves you in the general direction you need to go and then your other systems take over," Arkin says.
Krichmar's group has looked at adapting cognitive processes in rats for robotic navigation. He says that trying to transfer a model of primate cognition into robots is a big challenge, because of the higher degree of complexity.
For his part, Arkin thinks his team's work in robots will lead to a better understanding of why primates have spatial reasoning skills in the first place. "It will not only help us add perspective-taking and advice-taking applications to robots," he says, "but also help us understand processes we humans use every day, but know very little about."
The original article can be found at New Scientist.