At the WAVELab, Waslander and his team focus on outdoor autonomy for aerial and ground vehicles, advancing robot perception, planning, coordination, and control for new applications. The team conducts a variety of research projects through the lab and has completed experiments involving quadrotor flight control in the presence of wind, visual Simultaneous Localization And Mapping (SLAM), laser scan registration, and autonomous driving.
The lab is organized around three research topics: aerial vehicles, ground rovers, and autonomous driving. Lately, the team has been looking for a way to replace expensive laser scanner sensors with vision systems that perform the same function, Waslander said. Laser scanners provide high-quality data about the environment around robots as they move, making it easy for them to travel safely without bumping into objects. But these scanners are very expensive, about $70,000 each, and impractical for smaller vehicles.
To make safe movement more economical, the WAVELab team is developing new state-of-the-art robot perception algorithms, Waslander said. As the “ground truth” for their research, the WAVELab uses NovAtel’s OEM615™, a dual-frequency GNSS receiver card that can track all four satellite systems—GPS, GLONASS, Galileo, and BeiDou.
NovAtel’s RTK system employs a fixed base station that broadcasts RTK correction data to the aerial vehicle, Waslander said. These corrections let the receiver determine the vehicle’s location, relative to the base station, with an accuracy of two to five centimetres. The GNSS measurements enable the researchers to define the “true” motion of the platform, and then compare their processed results to that true path to see how well their algorithmic innovations work.
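The evaluation step described above, comparing a processed trajectory against the RTK reference, can be sketched in a few lines. This is a hypothetical illustration, not WAVELab code; the function name `trajectory_rmse` and the sample paths are invented:

```python
import math

def trajectory_rmse(estimated, ground_truth):
    """Root-mean-square position error between two time-aligned 2D paths (metres)."""
    assert len(estimated) == len(ground_truth)
    sq_errors = [
        (ex - gx) ** 2 + (ey - gy) ** 2
        for (ex, ey), (gx, gy) in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

# The RTK fixes (accurate to a few centimetres) serve as the reference path.
truth = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.1)]
estimate = [(0.0, 0.05), (1.02, 0.0), (2.1, 0.15)]
print(f"RMSE: {trajectory_rmse(estimate, truth):.3f} m")
```

Because the RTK error (2–5 cm) is far smaller than typical vision-only drift, a metric like this cleanly attributes the residual error to the perception algorithm under test.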
“Robots measure things that are relative to them, taking image or laser scans of the environment immediately around them,” Waslander said. “You’re collecting information as you move through the environment. If there’s an error in how you think you’re moving, every step in a new area accumulates more error. Over time, the errors grow and grow.”
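The error growth Waslander describes can be shown with a toy dead-reckoning example. This is an invented illustration, not anything from the lab: each relative motion measurement carries a small bias, and integrating those measurements makes the position error grow without bound:

```python
def dead_reckon(steps, true_step=1.0, bias=0.01):
    """Integrate biased relative-motion measurements.

    Returns (estimated_position, true_position) after the given number of steps.
    """
    est, true = 0.0, 0.0
    for _ in range(steps):
        true += true_step
        est += true_step + bias  # each relative measurement is slightly off
    return est, true

for n in (10, 100, 1000):
    est, true = dead_reckon(n)
    print(f"after {n} steps, accumulated error = {est - true:.2f} m")
```

A 1 cm error per step is harmless at first but becomes a 10 m error after a thousand steps, which is why an absolute reference like RTK GNSS is so valuable for keeping relative estimates honest.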
The researchers’ approach seeks to minimise those errors and enable the team to fly an Unmanned Aerial Vehicle (UAV) even in environments where GNSS isn’t available—through tunnels or under a canopy of trees, for example—while assuring them that the vision system is working the way they expect it to. The next step is to build maps from the visual data collected by the onboard sensors, reconstructing the obstacles, trees and buildings around the robot.
“My end goal is to use both systems together for inspection operations; so, we have RTK from NovAtel and vision estimates of our motion, and we can compare and fuse these measurements to monitor the quality of both sets of data,” Waslander said. “When GPS fails, the vision can take over, and when we are far away from useful visual features, the GPS measurements will be more than enough to safely and accurately control the vehicle.”
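The fusion idea in the quote above can be sketched with inverse-variance weighting, where the less certain sensor contributes less, and either sensor alone suffices when the other drops out. This is a minimal sketch of the general technique, not NovAtel’s or WAVELab’s implementation; `fuse` and its inputs are hypothetical:

```python
def fuse(gps, vision):
    """Fuse two scalar position estimates, each given as (value, variance) or None."""
    if gps is None and vision is None:
        raise ValueError("no measurements available")
    if gps is None:      # e.g. in a tunnel or under tree canopy: vision takes over
        return vision[0]
    if vision is None:   # e.g. far from useful visual features: GPS suffices
        return gps[0]
    (g, var_g), (v, var_v) = gps, vision
    w = var_v / (var_g + var_v)  # inverse-variance weight on the GPS estimate
    return w * g + (1 - w) * v

# GPS variance 0.04 m^2, vision variance 0.36 m^2: the result leans toward GPS.
print(fuse((10.0, 0.04), (10.4, 0.36)))
```

Comparing the two streams against each other, as Waslander suggests, also lets the system monitor each sensor’s quality before trusting it in the weighted combination.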
With NovAtel’s solution, Waslander said, he can trust that the data gathered is precise, and the team knows exactly how reliable the measurements are.