Drones and robots will have enhanced abilities to navigate their environments thanks to a group of Brisbane researchers.
Current technology relies on manual intervention when autonomous vehicles encounter different surroundings, but a team from QUT has developed an automated process.
Lead researcher Dr Alejandro Fontan Villacampa said the automated system would improve how robots mapped and navigated the world by making vision-based mapping systems more adaptable to different environments.
Dr Fontan and colleague Professor Michael Milford will present the development at the Robotics Science and Systems 2024 conference in Delft, the Netherlands, next week.
He said the new system would enhance Visual SLAM (Simultaneous Localisation and Mapping), a technology that helped devices such as drones, autonomous vehicles and robots navigate.
“(Visual SLAM) enables them to create a map of their surroundings and keep track of their location within that map simultaneously,” Dr Fontan said.
He said SLAM systems relied on specific types of visual features: distinctive patterns within images used to match and map the environment.
“Different features work better in different conditions, so switching between them is often necessary.
“However, this switching has been a manual and cumbersome process, requiring a lot of parameter tuning and expert knowledge.”
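The kind of automation described can be sketched conceptually. The following Python sketch is purely illustrative and is not the AnyFeature-VSLAM code; the detector names, scores, and `benchmark` function are hypothetical placeholders standing in for a real evaluation of each feature type:

```python
def benchmark(detector_name, sample_frames):
    """Hypothetical stand-in for evaluating how well a feature type
    tracks on a few sample frames (higher is better). A real system
    would run the SLAM front end and measure, e.g., match counts."""
    mock_scores = {"ORB": 0.72, "SIFT": 0.81, "BRISK": 0.64}
    return mock_scores[detector_name]

def auto_select_feature(candidates, sample_frames):
    """Pick the best-scoring feature type automatically, rather than
    asking a user to hand-tune and switch detectors manually."""
    return max(candidates, key=lambda name: benchmark(name, sample_frames))

best = auto_select_feature(["ORB", "SIFT", "BRISK"], sample_frames=[])
print(best)  # SIFT — the highest of the mocked scores
```

The point of the sketch is only the shape of the idea: scoring candidate feature types against the current environment and selecting one programmatically, instead of relying on expert parameter tuning.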
Dr Fontan said QUT’s new system, AnyFeature-VSLAM, added automation to ORB-SLAM2, a system widely used around the world.
“It enables a user to seamlessly switch between different visual features without laborious manual intervention,” he said.
“This automation improves the system’s adaptability and performance across various benchmarks and challenging environments.”