Robots combine human-like vision and thinking to improve their navigation

Dr Tobias Fischer – Research Fellow, Queensland University of Technology

Wouldn’t it be great if robots could assist us in our everyday lives? New research shows that robots can use special cameras to recognise familiar places and navigate the world, much as human eyes do.

Dr Tobias Fischer, working with colleagues at the QUT Centre for Robotics, has now shown that the key to such intelligent robots lies in “combining multiple beliefs of the world”.
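The article does not spell out how those beliefs are merged, but as a rough illustration (an assumption for explanation only, not the team’s actual method), the Python sketch below combines place-matching scores from several independent models and picks the place they collectively agree on.

    import numpy as np

    def fuse_beliefs(score_matrices):
        """Combine place-recognition scores from several models.

        Each matrix has shape (num_query_images, num_reference_places);
        higher scores mean a stronger match. Scores are normalised per
        model so no single model dominates, then summed.
        """
        fused = np.zeros_like(score_matrices[0], dtype=float)
        for scores in score_matrices:
            s = np.asarray(scores, dtype=float)
            s = (s - s.min()) / (s.max() - s.min() + 1e-9)  # scale to [0, 1]
            fused += s
        return fused

    # Three hypothetical models scoring 2 query images against 4 known places.
    model_a = np.array([[0.9, 0.2, 0.1, 0.3], [0.1, 0.8, 0.2, 0.4]])
    model_b = np.array([[0.7, 0.3, 0.2, 0.1], [0.2, 0.9, 0.3, 0.2]])
    model_c = np.array([[0.8, 0.1, 0.4, 0.2], [0.3, 0.7, 0.1, 0.5]])

    fused = fuse_beliefs([model_a, model_b, model_c])
    best_match = fused.argmax(axis=1)  # most likely place for each query image
    print(best_match)  # e.g. [0 1]

The intuition is that models which fail in different ways can cover for one another, so the combined answer tends to be more reliable than any single model’s belief on its own.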

Professor Michael Milford (QUT) says that currently, robots “are typically good at a single or very narrow range of tasks, but fail at anything else”.

Dr Fischer uses ‘event’ cameras—ones that only pay attention to changes in a scene, not the whole picture—to mimic how animals and humans think about moving around.
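To give a flavour of what “only pay attention to changes” means in practice, here is a simplified Python sketch (a toy simulation, not a real event-camera interface) that compares two ordinary grayscale frames and reports an event only for pixels whose brightness changed beyond a threshold.

    import numpy as np

    def frames_to_events(prev_frame, next_frame, threshold=15):
        """Emit (x, y, polarity) events for pixels whose brightness changed.

        Polarity is +1 for brightening, -1 for dimming; unchanged pixels
        produce no output, which is what makes event data so sparse
        compared with full frames.
        """
        diff = next_frame.astype(int) - prev_frame.astype(int)
        ys, xs = np.nonzero(np.abs(diff) > threshold)
        return [(x, y, 1 if diff[y, x] > 0 else -1) for x, y in zip(xs, ys)]

    # Two tiny synthetic 4x4 grayscale frames: one pixel brightens, one dims.
    prev_frame = np.full((4, 4), 100, dtype=np.uint8)
    next_frame = prev_frame.copy()
    next_frame[1, 2] = 180  # brightening -> event with polarity +1
    next_frame[3, 0] = 30   # dimming     -> event with polarity -1

    print(frames_to_events(prev_frame, next_frame))
    # [(2, 1, 1), (0, 3, -1)]

Because most of the scene usually stays still, only a handful of events are produced at any moment, which is one reason event cameras can be so frugal with power.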

“We can draw on animal behaviour as inspiration to make autonomous systems more capable, creating more sophisticated navigational systems,” Professor Milford says.

Event cameras have outstanding power efficiency, making them highly suitable for mobile robots that operate in remote areas, or even in space.

Dr Fischer works with colleagues including Somayeh Hussaini and Professor Milford.