In search-and-rescue scenarios, where time is critical and the environment can pose great hazards, collaboration among multiple robots plays a key role. While aerial robots are well suited to mapping large outdoor scenes, they struggle to inspect confined and occluded spaces that are not visible from the air. Conversely, a plethora of specialised ground robots can explore indoor and tight areas, but ground robots are generally not as fast as aerial robots. In this project, we therefore aim to combine the strengths of both aerial and ground robots to enable efficient and robust mapping of the search area, aiding decision-making by first responders and minimizing the exposure of rescuers to unnecessary risks. To this end, an aerial robot will be deployed to perform both mapping and localization of the ground robot, while the ground robot explores the areas that are not visible from the air.
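The core of the co-localization idea above is a frame transformation: once the aerial robot estimates the ground robot's pose relative to its own camera, that estimate can be composed with the aerial robot's world pose to localize the ground robot globally. As a minimal sketch (not the actual V4RL library API), assuming a simplified 2D setting with poses given as (x, y, heading):

```python
import math

def compose_se2(pose_a, pose_b):
    """Express pose_b (given in pose_a's frame) in the world frame.

    Each pose is a tuple (x, y, theta) with theta in radians.
    """
    xa, ya, ta = pose_a
    xb, yb, tb = pose_b
    return (
        xa + math.cos(ta) * xb - math.sin(ta) * yb,
        ya + math.sin(ta) * xb + math.cos(ta) * yb,
        (ta + tb + math.pi) % (2 * math.pi) - math.pi,  # wrap to (-pi, pi]
    )

# UAV pose in the world frame (e.g. from its visual-inertial odometry)
uav_in_world = (2.0, 1.0, math.pi / 2)
# Ground robot pose measured relative to the UAV (e.g. detected in its camera)
ground_in_uav = (1.0, 0.0, 0.0)

# Ground robot pose in the world frame: approximately (2.0, 2.0, pi/2)
ground_in_world = compose_se2(uav_in_world, ground_in_uav)
```

The real system operates on full 6-DoF (SE(3)) poses with visual-inertial estimation as in [1], but the composition step has the same structure.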
The student working on this project will be expected to extend the library developed at V4RL for co-localization of robots [1] to work with flexible robots, such as the EPFL Krock 2 robot [2]. In addition, the student will integrate a real-time aerial mapping system [3] and compute the areas that need to be inspected by the ground robot.
[1] Teixeira, L., Maffra, F., Moos, M., & Chli, M. VI-RPE: Visual-inertial relative pose estimation for aerial vehicles. IEEE Robotics and Automation Letters, 3(4), 2770-2777, 2018.
[2] Melo, K., Horvat, T., & Ijspeert, A. J. K-Rock a Bio-robot Outside the Lab Back In Nature. AMAM 8th International Symposium on Adaptive Motion of Animals and Machines, 2017.
[3] Teixeira, L., & Chli, M. Real-time mesh-based scene estimation for aerial inspection. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016.
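Computing the areas to be inspected by the ground robot can be illustrated with a toy example. Assuming (hypothetically) that the aerial map has been discretised into a grid labelling each cell as seen from the air, occluded (e.g. under a roof or overhang), or a wall, a flood fill from the aerially mapped free space yields the occluded cells the ground robot can actually reach — this is only an illustrative sketch, not the mesh-based pipeline of [3]:

```python
from collections import deque

# Hypothetical grid derived from an aerial map:
# 'O' = seen from the air, 'R' = occluded (roofed), '#' = wall.
GRID = [
    "OOOO#",
    "O##R#",
    "O#RR#",
    "OOOO#",
]

def ground_robot_targets(grid):
    """Return occluded ('R') cells reachable from aerially mapped ('O') space.

    4-connected flood fill: the ground robot can only inspect occluded
    cells it can drive into from free space; walls ('#') block it.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque((r, c) for r in range(rows) for c in range(cols)
                  if grid[r][c] == "O")
    seen = set(queue)
    targets = set()
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen:
                if grid[nr][nc] in ("O", "R"):
                    seen.add((nr, nc))
                    queue.append((nr, nc))
                    if grid[nr][nc] == "R":
                        targets.add((nr, nc))
    return targets
```

For the grid above, the three roofed cells in the lower-right pocket are returned as inspection targets for the ground robot.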
WP1: Literature review and familiarization with our existing relative pose estimation system implementation.
WP2: Implementation of aerial model-based co-localization between a flexible ground robot and a UAV.
WP3: Integration of the new system with a state-of-the-art real-time mapping algorithm.
WP4: Experimental evaluation of the new method.
WP5: Report and presentation.
The student taking this project needs to be highly motivated, preferably with strong analytical skills; experience in C/C++ programming and computer vision would be very beneficial.
Lucas Teixeira, lteixeira@mavt.ethz.ch
Fabiola Maffra, fmaffra@mavt.ethz.ch
Matthew Estrada, matthew.estrada@epfl.ch
Margarita Chli, chlim@ethz.ch