Multi-sensor fusion for fast and accurate mobile robot perception
The goal of this project is to reliably fuse and classify 2D lidar and 3D depth image data on a mobile robot. The resulting obstacle map is used for path planning.
In many of today’s applications, the material handling industry uses 2D lidars on mobile robots solely to trigger safety stops as soon as an object is nearby. In addition, mostly static environments are assumed, which significantly limits the scope of possible areas of application of those mobile platforms. At Sevensense we want to revolutionize these approaches and believe that far more can be achieved with existing sensor technologies. In this project we would particularly like to focus on an increased perceptual understanding, which will improve the robot’s decision-making abilities (such as path planning) and improve the interaction with humans.
The aim of this project is therefore to develop novel techniques to optimally use existing depth sensor setups for path planning and possibly other interactions with the environment. In particular, you will work with 2D lidar sensors as well as stereo cameras and other 3D depth sources in order to develop the best possible understanding of the environment. On the one hand, this involves a fusion of 2D and 3D data, but also the classification of object types and their expected movement in the case of dynamic objects. Last but not least, the data should be condensed in a map representation in order to achieve improved path planning actions of the mobile robot.
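The fusion idea above can be sketched in a few lines: project 2D lidar returns and 3D depth points into one shared 2D obstacle map that a planner can consume. This is a minimal illustrative sketch, not Sevensense's actual pipeline; the grid parameters, function names, and the simple height filter are all assumptions made for the example.

```python
# Minimal sketch: fuse a 2D lidar scan and 3D depth points into a shared
# 2D occupancy grid. All names and parameters here are illustrative.
import math

GRID_SIZE = 20           # cells per side
RESOLUTION = 0.25        # metres per cell
ORIGIN = GRID_SIZE // 2  # robot sits at the grid centre

def world_to_cell(x, y):
    """Map a metric (x, y) point in the robot frame to a grid cell."""
    return (ORIGIN + int(round(x / RESOLUTION)),
            ORIGIN + int(round(y / RESOLUTION)))

def fuse(lidar_ranges, depth_points, max_range=2.0, min_height=0.1):
    """Mark occupied cells from both sensors in one grid.

    lidar_ranges: list of (angle_rad, range_m) returns from the 2D lidar.
    depth_points: list of (x, y, z) points from a depth camera, robot frame.
    """
    occupied = set()
    # 2D lidar: polar returns become obstacle cells in the scan plane.
    for angle, r in lidar_ranges:
        if r < max_range:
            occupied.add(world_to_cell(r * math.cos(angle),
                                       r * math.sin(angle)))
    # 3D depth: keep points above the floor and project them onto the
    # plane, so obstacles outside the lidar's scan plane (e.g. a table
    # top) still block the planner.
    for x, y, z in depth_points:
        if z > min_height and math.hypot(x, y) < max_range:
            occupied.add(world_to_cell(x, y))
    return occupied

# A lidar hit 1 m straight ahead, plus a depth point at 0.5 m height
# that the 2D scan plane would miss: both land in the fused map.
grid = fuse([(0.0, 1.0)], [(1.0, 1.0, 0.5)])
```

A real system would additionally ray-trace free space, time-stamp and decay cells for dynamic objects, and attach class labels from the classification step, but the core projection-into-a-common-frame idea is the same.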
During this project, you will develop code (in C++, CUDA and Python) within a shared codebase, learning industry-standard development practices such as code reviews and unit testing. By the end of the project, you will have contributed to ongoing research and built considerable knowledge in the rapidly developing field of mobile robot perception using both cameras and lidars. Hands-on experience with a mobile robotic platform will further deepen your understanding of the software-hardware interplay. You will join an active team of developers at Sevensense, a Zurich-based robotics startup, and collaborate with the Autonomous Systems Lab, one of the largest robotics research groups in the world.
- Make yourself familiar with our perception framework as well as current state-of-the-art dense mapping solutions.
- Investigate different fusion methods, then design and implement the most promising strategy.
- Build upon the state of the art by developing your own ideas and incorporating your supervisor's input.
- Design and conduct experiments with a mobile robot to evaluate the selected approach.
- Strong self-motivation and curiosity for solving challenging robotic problems.
- Excellent programming skills in C++ (e.g. having written several thousand lines of code) and the ability to work on large codebases.
- Experience with Linux, ROS, and typical development tools such as git are advantageous.
- A very good academic record is desirable but may be compensated for by strong knowledge of the areas mentioned above.
If you are interested, please send your CV and transcripts to Jordan Burklund (jordan.burklund@sevensense.ch), Fabian Blöchliger (fabian.bloechliger@sevensense.ch) and Renaud Dubé (renaud.dube@sevensense.ch).