Robust odometry for ground robots
Ground robots offer great opportunities to perform autonomous tasks in many applications. However, their state estimation is not trivial: wheels slip, IMU biases are poorly observable, and GPS is often unavailable. In this project, we want to investigate new sensor-fusion approaches to achieve robust, high-accuracy state estimation for ground robots.
Simultaneous Localization and Mapping (SLAM) is a crucial capability for many mobile ground robots, in applications such as service robotics, industrial inspection and automation. The goal is to jointly estimate the trajectory of the robot and a map of its environment from noisy sensor data.
For this purpose, Inertial Measurement Units (IMUs) have been combined with cameras with great success [1,2]. While these systems are well suited to flying platforms, they suffer from certain issues on ground vehicles. The main issue is the estimation of the IMU bias, a systematic error that must be separated from the measurement of gravity and the robot's actual motion.
As a solution, in this project we plan to investigate additional sensors or assumptions that can improve visual-inertial state estimation on ground robots. Possible approaches include fusing wheel odometry and the IMU in the state estimation process, or addressing the bias problem through an assumption of local planarity.
- Literature review and familiarization with current tools
- Data collection on existing robotic platforms
- Exploring various solutions to bias and/or scale estimation
- Evaluation of results on collected datasets and on a real robot
We are looking for a highly motivated student with experience in C++ programming and an interest in computer vision and robotics. Previous knowledge of ROS, OpenCV and IMUs is a plus.
If you are interested, please contact Andrei Cramariuc (andrei.cramariuc@mavt.ethz.ch) and Florian Tschopp (florian.tschopp@mavt.ethz.ch), sending your CV, transcript of records and a short paragraph on why you want to work on this project.