Collaborative Navigation for Flying and Walking Robots
This work is under review as:
P. Fankhauser, M. Bloesch, P. Krüsi, R. Diethelm, M. Wermelinger, T. Schneider, M. Dymczyk, M. Hutter and R. Siegwart, “Collaborative Navigation for Flying and Walking Robots,” in IEEE International Conference on Robotics and Automation (ICRA), 2016. (submitted)
Flying and walking robots can exploit their complementary capabilities in terms of viewpoint and payload to the full in a heterogeneous team. To this end, we present an online collaborative navigation framework for unknown and challenging terrain. The method leverages the flying robot’s onboard monocular camera to create both a map of visual features for simultaneous localization and mapping (SLAM) and a dense representation of the environment in the form of an elevation map. This prior knowledge from the initial exploration enables the walking robot to localize itself against the global map and to plan a global path to the goal by interpreting the elevation map in terms of traversability. While the robot follows the planned path, absolute pose corrections are fused with the legged state estimation, and the elevation map is continuously updated with distance measurements from an onboard laser range sensor. This allows the legged robot to navigate safely towards the goal while accounting for any changes in the environment.
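As an illustrative sketch only (not the authors' implementation), interpreting an elevation map in terms of traversability can be done by scoring each grid cell from local slope and roughness. The function name, the 3×3 roughness window, and the critical slope/roughness values below are assumptions chosen for the example:

```python
import numpy as np

def traversability(elevation, cell_size=0.1,
                   critical_slope=0.5, critical_roughness=0.05):
    """Score each elevation-map cell in [0, 1]: 1 = flat and smooth, 0 = untraversable.

    The thresholds are hypothetical; a real system would tune them to the
    robot's leg kinematics.
    """
    # Slope: magnitude of the local elevation gradient (rise over run).
    gy, gx = np.gradient(elevation, cell_size)
    slope = np.sqrt(gx ** 2 + gy ** 2)

    # Roughness: deviation from a locally smoothed surface (3x3 mean filter,
    # edges replicated so the output keeps the map's shape).
    padded = np.pad(elevation, 1, mode="edge")
    h, w = elevation.shape
    local_mean = sum(padded[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)) / 9.0
    roughness = np.abs(elevation - local_mean)

    # Each criterion maps linearly to [0, 1] relative to its critical value;
    # the cell score is the most restrictive of the two.
    s = np.clip(1.0 - slope / critical_slope, 0.0, 1.0)
    r = np.clip(1.0 - roughness / critical_roughness, 0.0, 1.0)
    return np.minimum(s, r)
```

A global planner could then search for a path that maximizes accumulated traversability rather than pure geometric distance, which matches the idea of planning over the flying robot's elevation map before the walking robot starts moving.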