Docking on the Fly

Waslander and the WAVELab team have also used NovAtel’s RTK positioning in precise docking, successfully landing an aerial vehicle on a moving ground target during one of their experiments.

To make this happen, the team put NovAtel OEMStar® GPS+GLONASS single-frequency receivers on an Ascending Technologies (AscTec) Pelican quadrotor and a Clearpath Husky ground vehicle. They used relative positioning to determine how far apart the two vehicles were and a decentralized controller to converge on a common point, enabling the drone to land on the ground vehicle autonomously. Again, the relative accuracy required was in the two-to-five-centimetre range. The team performed multiple successful landings, even in 15 km/h winds.
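The article doesn't publish the team's controller, but the idea of closing a control loop on an RTK relative-position (baseline) measurement can be sketched roughly as follows. This is a minimal illustration with hypothetical names, gains, and thresholds, assuming the RTK solution supplies the baseline vector and the ground vehicle's velocity in a shared local frame; it is not the WAVELab implementation.

```python
import numpy as np

# Minimal sketch of a relative-positioning landing loop (hypothetical names;
# not the WAVELab controller). Assumes an RTK solution already provides the
# baseline vector from the ground vehicle to the quadrotor in a local
# east-north-up (ENU) frame, accurate to a few centimetres.

LANDING_RADIUS_M = 0.05   # horizontal capture radius, roughly the 2-5 cm cited
DESCEND_ALT_M = 0.10      # only descend through this altitude when centred
K_P = 0.8                 # proportional gain on the relative position error

def landing_velocity_command(baseline_enu, ground_velocity_enu):
    """Return a velocity setpoint (numpy array, m/s, ENU) that drives the
    quadrotor onto the moving ground vehicle.

    baseline_enu        -- RTK baseline: quadrotor position minus ground
                           vehicle position, metres, ENU, numpy array
    ground_velocity_enu -- ground vehicle velocity, m/s, ENU, numpy array
    """
    horiz_error = baseline_enu[:2]
    alt = baseline_enu[2]

    # Feed forward the target's velocity and close the loop on the
    # horizontal offset so both vehicles converge on a common point.
    cmd = np.zeros(3)
    cmd[:2] = ground_velocity_enu[:2] - K_P * horiz_error

    if np.linalg.norm(horiz_error) < LANDING_RADIUS_M:
        cmd[2] = -0.3            # centred over the deck: descend
    elif alt < DESCEND_ALT_M:
        cmd[2] = +0.2            # too low while off-centre: climb away
    else:
        cmd[2] = 0.0             # hold altitude while converging
    return cmd
```

The key point the sketch illustrates is that only the relative measurement enters the loop, which is why centimetre-level RTK baselines, rather than each vehicle's absolute fix, make the landing feasible.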

“We got great signals on both units,” Waslander said. “You can rely almost entirely on the RTK solution for the landing.”

Although they haven’t landed a UAV on a slow-moving boat using RTK just yet, it’s an experiment the WAVELab team would like to take on. Normally, accomplishing such a feat involves cutting power to the motors over a safety net or enlisting someone on the boat to seize the legs of the UAV and perform a “grab and kill maneuver.” Neither approach is fully autonomous or safe for either the operator or the UAV. Fully automating the landing procedure reduces the associated risks.

Much like with a moving ground rover, landing a UAV on a moving boat requires accurate relative position measurements of the quadrotor, meaning GNSS alone just isn’t enough. Standalone position measurement errors on the UAV and boat can add up to 10 metres of position error between the two vehicles, making a successful landing impossible. NovAtel’s RTK system eliminates that error.
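As a back-of-the-envelope illustration of why standalone fixes won't do, each vehicle's independent error can combine with the other's, whereas an RTK baseline is measured directly between the two antennas. The per-vehicle error figures below are assumptions chosen to match the roughly 10-metre worst case cited above.

```python
import math

# Illustrative numbers only: assumed standalone position errors per vehicle.
uav_error_m = 5.0
boat_error_m = 5.0

# Worst case: the two errors point in opposite directions and add directly.
worst_case_relative_error_m = uav_error_m + boat_error_m          # ~10 m
# Typical case: independent errors combine roughly as root-sum-square.
typical_relative_error_m = math.hypot(uav_error_m, boat_error_m)  # ~7 m

# An RTK baseline measures the vector between the two antennas directly,
# so most error sources cancel and the relative error drops to centimetres.
rtk_relative_error_m = 0.02  # 2-5 cm, as cited in the article

print(worst_case_relative_error_m,
      round(typical_relative_error_m, 1),
      rtk_relative_error_m)
```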

The ability to safely land a UAV on ground or water-based platforms offers a variety of benefits, Waslander said, including making it possible to swap out batteries and sensors during field operations in order to extend mission times.

“The aerial vehicles are very limited in payload capacity and battery life; so, you can only do small missions,” Waslander said. “We looked at trying to couple them with ground vehicles and boats because they can carry extra batteries and swappable sensors. So, you can dock and swap the batteries or the sensors, then boat or drive off to the next task.” 

Beyond that, teaming up unmanned aerial and ground vehicles can dramatically improve ground vehicle coverage for search or mapping operations, Waslander said, while allowing the aerial vehicle to go further afield without running out of power.

This ability also allows automatic deployment from ships, he pointed out, enabling UAVs, for example, to scout for or track icebergs outside their ship’s field of view. This could be useful for oil drilling in the Arctic, fully autonomous border surveillance, and other long-duration aerial missions in which multiple vehicles are switched in and out of operation as needed.

This same technology could also make it possible to dock UAVs on cars, trucks, trains, and large aircraft that are in motion, Waslander said, opening up even more applications.

Why the Research Matters

With the emergence of a commercial UAS market, robots can now perform a wide range of tasks, from precision agriculture to cinematography.

Waslander predicts the next wave of advancement in this technology will focus on precision positioning and relative positioning, making it possible to perform more complicated tasks such as detailed pipeline and mine inspections. This is an area in which the WAVELab is looking to advance—with the help of high-accuracy, global and relative positioning solutions from NovAtel.

“My sense is we’re on the cusp of big things in robotics, and from what I’ve seen it’s finally at the point where the sensing is good enough and the computation is fast enough that we can make smart decisions on robots in real time,” Waslander said.

“The kind of information we can collect will enable us to make smarter decisions about the operations of facilities and infrastructures, but we can only do that if we precisely control how the vehicle moves in relation to dangerous objects,” he added. “That’s why we’re focused on getting vision, GPS, and other sensors all integrated into a common picture of what’s going on around the robot, so that it can make the right decision and keep itself safe while still providing the information the operator needs.”