As part of the DARPA Subterranean Challenge (https://www.subtchallenge.com/), teams are required to autonomously explore and accurately map a networked cave system below ground with little-to-no communication from an operator. The ASL has significant experience with mapping onboard Micro Aerial Vehicles (MAVs); however, the challenging conditions, namely low light, dust, and narrow spaces, have necessitated a new approach.
Rapid development in 3D LIDAR (Light Detection and Ranging) technology in recent years has led to sensors that are now light enough to be carried by small MAVs. This sensing modality, which has been the mainstay of sensing on ground robots for decades, offers some advantages over cameras, which are traditionally used on MAVs. However, using LIDARs on MAVs is challenging, primarily because MAV motion is highly dynamic, which complicates the mapping process. This project aims to address these challenges by combining LIDAR-based mapping with the complementary features of camera-based motion estimation.
Familiarization with our current mapping pipeline
Literature review
Development of algorithms
Evaluation and testing using one of our MAV platforms
(In order of importance)
Ability to work independently
Experience with C++
Background and interest in linear systems, estimation, deep learning, or optimization
Interest in working with robotics hardware
ROS knowledge (beneficial)
Python (beneficial)
If you are interested in this project, please send your transcripts and CV to Alex Millane (alexander.millane@mavt.ethz.ch), Christian Lanegger (christian.lanegger@mavt.ethz.ch), or Victor Reijgwart (victor.reijgwart@ethz-asl.ch).