Multi Sensor Fusion for High Accuracy Robot Navigation
The overall goal of this project is to develop a multi-sensor navigation platform that allows autonomous robots to map the world and localize within the created map with high precision (sub-cm).
The overall goal of this project is to develop a navigation platform that allows autonomous robots to map the world and localize within the created map with high precision (sub-cm). The sensor suite that should enable this task comprises stereo cameras (Mynteye), depth cameras, and LiDAR sensors (Ouster OS0-64). The first crucial step toward fusing these different sensor modalities is calibrating the intrinsic and extrinsic parameters of the sensors, as well as estimating the time offsets between the individual sensors. We build on Kalibr, a calibration framework well established in the robotics community, and plan to extend it further.
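To give a flavor of the time-offset estimation mentioned above, here is a minimal, self-contained sketch (not Kalibr's actual estimator): when two sensors observe a common motion signal, such as the angular-velocity magnitude, their relative delay can be recovered by cross-correlation. Signal names and the synthetic setup below are illustrative assumptions only.

```python
import numpy as np

def estimate_time_offset(sig_a, sig_b, rate_hz):
    """Estimate the delay of sig_b relative to sig_a (in seconds) by
    cross-correlating a shared motion signal sampled at the same rate.
    This is a toy illustration, not Kalibr's continuous-time estimator."""
    a = sig_a - np.mean(sig_a)
    b = sig_b - np.mean(sig_b)
    corr = np.correlate(a, b, mode="full")
    lag = np.argmax(corr) - (len(b) - 1)  # lag of a relative to b
    return -lag / rate_hz                 # positive = b lags behind a

# Synthetic check: sig_b is sig_a delayed by 5 samples at 100 Hz -> 0.05 s.
rate = 100.0
t = np.arange(0, 10, 1.0 / rate)
sig_a = np.sin(2 * np.pi * 0.5 * t)       # 5 full cycles, periodic window
sig_b = np.roll(sig_a, 5)                 # circular shift = clean 5-sample delay
print(estimate_time_offset(sig_a, sig_b, rate))
```

In practice the correlated signal would come from, e.g., IMU gyroscope norms and rotation rates differentiated from camera or LiDAR poses, and the peak would be refined with sub-sample interpolation.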
- Literature review of state-of-the-art calibration methods
- Setup of the individual sensors and a calibration environment
- Intrinsic and extrinsic calibration of a LiDAR sensor
- Extrinsic calibration of a reference system (Optitrack)
- Time offset calibration between all involved sensors
- Enable the use of 3D Targets for camera calibration
- Evaluation of the system
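At its core, the extrinsic-calibration tasks above (LiDAR and Optitrack reference system) reduce to estimating a rigid transform between two frames from corresponding points. The sketch below shows one standard closed-form solution, the Kabsch (SVD) algorithm, on synthetic data; the data and variable names are assumptions for illustration, and real calibrations would of course handle noise, outliers, and target detection.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t,
    via the Kabsch algorithm. src and dst are corresponding (N, 3) points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: recover a known rotation about z and a translation.
rng = np.random.default_rng(0)
src = rng.normal(size=(50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
dst = src @ R_true.T + t_true
R_est, t_est = rigid_transform(src, dst)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

With noisy correspondences the same solver is typically wrapped in a robust loop (e.g. RANSAC), and for hand-eye-style problems between trajectories rather than points, dedicated formulations are used instead.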
- Highly motivated and independent student
- Interest in sensors and calibration
- Good programming skills in Python and C++
- Experience in ROS, Git
Please send your CV and transcripts (BA and MA) to Abel Gawel (gawela@ethz.ch) and Michael Helmberger (michael.helmberger@hilti.com).