The project is a collaboration between U of T Engineering, Drone Delivery Canada (DDC) and Defence Research and Development Canada (DRDC). Professor Angela Schoellig (UTIAS) is leading the team, which also includes Professor Tim Barfoot (UTIAS).
“If drones are going to move beyond their current applications, they will need to be able to deal with all kinds of challenging conditions without help from ground-based operators,” says Schoellig. “Visual navigation provides a way forward, and we are excited to apply our expertise in this field to develop a solution.”
Currently, federal regulations require that UAVs remain within visual range of their operators. The idea is that if there is a problem, the operator will notice and step in with manual controls to avert a crash. But in the future, companies like DDC envision using drones over much larger distances, where maintaining visual contact will be impossible.
“In Canada, there are hundreds of remote communities that lack infrastructure, specifically roads, which makes access to goods both very limited and very expensive,” says Tony Di Benedetto, CEO of DDC. “We are looking at utilizing drones as a logistics vehicle to help our fellow Canadians in these communities.”
Current GPS technology allows UAVs to navigate without the help of a ground crew. But if this system gets disrupted by bad weather or malfunctioning equipment, and the drone is beyond the reach of radio communication, it could lose control and crash, creating a safety hazard, not to mention the financial cost of lost equipment and cargo.
Schoellig and her team are proposing to develop a navigation system that relies instead on digital photos taken by drones as they fly. If communications or GPS are disrupted, the drone could use those images to find its way back to its launch point.
“Think of it as Hansel and Gretel with breadcrumbs,” says Barfoot. “We’re tracing our path on the outbound flight, and then if something goes wrong we can follow the trail back to where we started.”
Though the concept may seem simple, the execution is not. The system needs to recognize the same scene under different lighting and weather conditions, and to account for jostling due to wind. It also needs to be self-contained, running on a computer small enough to fit on the drone, without access to an external network, and it must consume very little power.
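The "breadcrumb" idea Barfoot describes is known in robotics as visual teach and repeat. A very rough sketch of the logic, with simple feature vectors standing in for camera images and all names purely illustrative (the team's actual system is far more sophisticated), might look like this:

```python
# Toy sketch of visual "teach and repeat": on the outbound leg the drone
# stores keyframes (here, short feature vectors standing in for images);
# on the return leg it matches its current view against the stored trail
# and retraces the keyframes in reverse order back to the launch point.
# All names and data are illustrative, not the project's real system.

def l2_dist(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class BreadcrumbTrail:
    def __init__(self):
        self.keyframes = []  # (feature_vector, position) pairs, outbound order

    def teach(self, features, position):
        """Outbound leg: drop a 'breadcrumb' keyframe."""
        self.keyframes.append((features, position))

    def localize(self, features):
        """Index of the stored keyframe that best matches the current
        view (nearest neighbour in feature space)."""
        return min(range(len(self.keyframes)),
                   key=lambda i: l2_dist(self.keyframes[i][0], features))

    def repeat_home(self, features):
        """Return leg: positions to fly, from the best-matching keyframe
        back to the start, in reverse outbound order."""
        i = self.localize(features)
        return [pos for _, pos in reversed(self.keyframes[: i + 1])]

trail = BreadcrumbTrail()
# Outbound flight: record a keyframe at each waypoint.
for pos, feat in [((0, 0), [0.0, 0.0]), ((0, 50), [0.1, 0.9]),
                  ((0, 100), [0.9, 0.8]), ((0, 150), [0.7, 0.1])]:
    trail.teach(feat, pos)

# GPS is lost near the third waypoint; match the current view, fly home.
route_home = trail.repeat_home([0.85, 0.82])
print(route_home)  # [(0, 100), (0, 50), (0, 0)]
```

A real system would match thousands of image features per frame under changing light, which is exactly the hard part the team is tackling.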
The project is funded by a DND/NSERC Research Partnership Grant, with approximately half of the funding contributed by the two external partners, DDC and DRDC. In total, the partnership will provide more than $450,000 over four years, after which time the team aims to have the first prototypes ready to fly.
“Creating algorithms that can adapt to the very complex conditions of the real world is very challenging, but the potential gains are enormous,” says Schoellig. “It would open up a whole new range of possible applications, not only for our partners, but for the entire industry.”
Di Benedetto agrees. “We see this as a critical component for the evolution of drone technology in Canada,” he says. “We’re doing something completely new.”