Interactive Perception for Object Detection
The goal of the project is to develop a pipeline capable of finding a desired object through interaction with the environment. The initial algorithm will be trained and tested in simulation, and later on transferred to the real robot.
Most current approaches to robotic perception sense the environment passively through sensors, without physical interaction. However, human perception is known to benefit from interaction with the environment: we exploit different kinds of physical contact with objects to enhance perception and thereby ease manipulation and decision making.
This project aims to develop an integrated perception-manipulation framework able to deal with the longstanding problems of occlusion, viewpoint changes, scene changes, and clutter. Interactive perception will allow the robot to perceive and understand the environment in much more detail than passive observation can provide. By physically interacting with objects, the robot can infer information about their underlying properties and structure that would otherwise not be observable. For example, finding a specific object in a pile may require the robot to first move aside the objects that occlude it.
During this project, the student will have the opportunity to join an active team of developers and will be invited to contribute to a common framework. The student will also work with different state-of-the-art RGB-D sensors and a dual-arm YuMi robot. By the end of the project, the student will have developed considerable knowledge in reinforcement learning, object detection, and object segmentation, and, if the project is successful, will be invited to publish the work.
- Make yourself familiar with our code base.
- Perform a literature review on interactive perception.
- Select and implement the most promising strategy for performing interactive perception in simulation.
- Train and evaluate the algorithm in simulation, assuming full knowledge of the perception output.
- Combine the developed algorithm with a perception pipeline.
- Transfer the developed pipeline to the real robot.
- Build upon the state of the art by combining your own ideas with your supervisor's input.
- Design and conduct experiments with a robot to evaluate the selected approach.
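The core idea behind the steps above can be illustrated with a minimal sketch of the interactive-perception loop: observe the scene, and if the target object is occluded, interact with the pile until it becomes visible. The toy environment, object names, and `push_aside` action below are illustrative assumptions for this posting, not part of the actual project code base.

```python
import random


class PileEnv:
    """Toy pile of objects where only the topmost object is 'visible'."""

    def __init__(self, objects, target, seed=0):
        rng = random.Random(seed)
        self.pile = list(objects)
        rng.shuffle(self.pile)
        self.target = target

    def observe(self):
        # Passive perception: only the top of the pile can be seen.
        return self.pile[-1] if self.pile else None

    def push_aside(self):
        # Interaction: move the occluding top object out of the way.
        return self.pile.pop() if self.pile else None


def find_object(env, max_steps=10):
    """Interact with the pile until the target object becomes visible.

    Returns the number of interactions needed, or None on failure.
    """
    for step in range(max_steps):
        if env.observe() == env.target:
            return step
        env.push_aside()
    return None


env = PileEnv(["mug", "box", "ball", "book"], target="ball", seed=1)
steps = find_object(env)
```

In the actual project, the hand-coded `find_object` policy would be replaced by one learned with reinforcement learning in simulation, and `observe` by a real perception pipeline running on RGB-D data.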
- Strong self-motivation and curiosity for solving challenging robotic perception problems.
- Excellent programming skills (e.g. having written several thousand lines of code), ideally in Python/C++, and the ability to work on large code bases.
- In-depth knowledge in at least two of the three following areas: Machine Learning, Optimization, and Computer Vision.
- Experience with Linux, ROS, and typical development tools such as git or Jenkins is advantageous.
- A very good academic record is desirable but may be compensated by expert knowledge in the areas mentioned above.
If you are interested in this project, please send your transcripts and CV to Tonci Novkovic (tonci.novkovic@mavt.ethz.ch) and Fadri Furrer (fadri.furrer@mavt.ethz.ch).