Odometry and Mapping in Dynamic Environments
Existing lidar-inertial odometry approaches (e.g., FAST-LIO2 [1]) are capable of providing sufficiently accurate pose estimation in structured environments to capture high-quality 3D maps of static structures in real time. However, the presence of dynamic objects in an environment can reduce the accuracy of the odometry estimate and produce noisy artifacts in the captured 3D map. Existing approaches to handling dynamic objects [2-4] focus on detecting and filtering them from the captured 3D map but typically operate independently from the odometry pipeline, which means that the dynamic filtering does not improve the pose estimation accuracy.
The goal of this project is to develop a lidar-inertial odometry approach that tightly integrates dynamic object filtering into the pose estimation and mapping pipeline. The starting point will be investigating whether changes in a set of geometric primitives extracted from the FAST-LIO2 [1] odometry pipeline can be used to detect dynamic objects. The performance of the resulting approach will be evaluated in comparison with existing odometry [1] and dynamic object filtering [2-4] approaches.
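As a rough illustration of this idea, the sketch below shows one possible way to flag scan points that are inconsistent with locally planar geometry in the existing map, so that they can be excluded or down-weighted in the point-to-plane odometry residuals. It assumes Eigen and an external nearest-neighbor lookup into the current map; all function names and thresholds are hypothetical and this is not the project's prescribed method, only a starting point consistent with the plane primitives used in FAST-LIO2-style pipelines.

```cpp
// Minimal sketch (illustrative only): classify a lidar point as potentially
// dynamic if its point-to-plane residual against a locally fitted map plane
// is large. Neighbor lookup, thresholds, and integration with the odometry
// pipeline are assumptions, not part of FAST-LIO2's actual API.
#include <Eigen/Dense>
#include <algorithm>
#include <cmath>
#include <vector>

struct PlaneFit {
  Eigen::Vector3d normal;    // unit normal of the fitted plane
  Eigen::Vector3d centroid;  // centroid of the neighbor points
  bool valid = false;        // false if the neighborhood is not planar enough
};

// Fit a local plane to map neighbors via PCA of the neighborhood covariance.
PlaneFit fitLocalPlane(const std::vector<Eigen::Vector3d>& neighbors,
                       double planarity_thresh = 0.1 /* assumed */) {
  PlaneFit fit;
  if (neighbors.size() < 5) return fit;
  Eigen::Vector3d mean = Eigen::Vector3d::Zero();
  for (const auto& p : neighbors) mean += p;
  mean /= static_cast<double>(neighbors.size());
  Eigen::Matrix3d cov = Eigen::Matrix3d::Zero();
  for (const auto& p : neighbors) {
    const Eigen::Vector3d d = p - mean;
    cov += d * d.transpose();
  }
  Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> es(cov);
  // Small ratio of smallest to largest eigenvalue => locally planar patch.
  const double planarity =
      es.eigenvalues()(0) / std::max(es.eigenvalues()(2), 1e-9);
  fit.normal = es.eigenvectors().col(0);  // eigenvector of smallest eigenvalue
  fit.centroid = mean;
  fit.valid = planarity < planarity_thresh;
  return fit;
}

// Classify a scan point (already transformed into the map frame with the
// current pose estimate): if its distance to the local map plane is large,
// it is likely to belong to a dynamic object and should not constrain the pose.
bool isLikelyDynamic(const Eigen::Vector3d& point_in_map,
                     const std::vector<Eigen::Vector3d>& map_neighbors,
                     double residual_thresh = 0.3 /* metres, assumed */) {
  const PlaneFit fit = fitLocalPlane(map_neighbors);
  if (!fit.valid) return false;  // no reliable primitive: keep the point
  const double residual =
      std::abs(fit.normal.dot(point_in_map - fit.centroid));
  return residual > residual_thresh;
}
```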
References:
[1] W. Xu, Y. Cai, D. He, J. Lin, and F. Zhang, “FAST-LIO2: Fast Direct LiDAR-inertial Odometry,” arXiv, 2021.
[2] L. Schmid, O. Andersson, A. Sulser, P. Pfreundschuh, and R. Siegwart, “Dynablox: Real-Time Detection of Diverse Dynamic Objects in Complex Environments,” IEEE Robotics and Automation Letters, 2023.
[3] D. Duberg, Q. Zhang, M. Jia, and P. Jensfelt, “DUFOMap: Efficient Dynamic Awareness Mapping,” IEEE Robotics and Automation Letters, 2024.
[4] H. Lim, S. Hwang, and H. Myung, “ERASOR: Egocentric Ratio of Pseudo Occupancy-based Dynamic Object Removal for Static 3D Point Cloud Map Building,” IEEE Robotics and Automation Letters, 2021.
- WP1: Literature review of work on lidar-inertial odometry and dynamic object detection.
- WP2: Develop a lidar-inertial odometry approach that can robustly handle dynamic environments.
- WP3: Evaluate the performance of the approach in comparison with existing work.
Experience with C++ and ROS.
Please send CV and transcripts to Rowan Border (border.rowan@ucy.ac.cy) and Ruben Mascaro (rmascaro@ethz.ch).