Learning from demonstration in virtual reality
The goal of this project is to teach a robot new manipulation skills by providing demonstrations collected in VR using a tracking system such as the Oculus Rift.
Future generations of robots will leave the controlled settings of factory floors and research labs, opening up new applications in unstructured and dynamic environments. The lack of global knowledge and the large variety of possible scenarios render traditional programming of robot movements intractable. A promising way to sidestep explicit programming is Learning from Demonstration (LfD), which considers the problem of acquiring new skills by imitating an expert and has been successfully applied to driving and drone control tasks. However, collecting data for robot manipulation with visual feedback remains challenging due to visual artifacts and the non-trivial correspondence between demonstrations and robot motion.
The goal of this project is to teach a two-armed collaborative robot specific manipulation skills through supervised learning. This will require implementing a data acquisition pipeline that records camera images together with the tracked motion of a demonstrator, as well as developing a mapping between the collected trajectories and robot motion. Possible approaches include a fully virtual simulation or teleoperation of the physical system, using an Oculus Rift as the interface between the demonstrator and the robot. The final system will be tested by training the robot to perform selected manipulation tasks, for example block stacking or decluttering.
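To make the demonstrator-to-robot mapping concrete, below is a minimal sketch in Python; the calibration transform, workspace scale, and all function names are illustrative assumptions, not part of the project specification. It converts a tracked controller pose (position plus unit quaternion, as typically reported for the Rift's Touch controllers) into an end-effector target expressed in the robot base frame.

```python
# Minimal sketch of one possible demonstrator-to-robot mapping (assumptions:
# the tracker reports controller poses as position + quaternion in the VR
# frame; the transform and scale below are hypothetical placeholders).
import numpy as np

# Hypothetical fixed transform from the VR tracking frame to the robot base
# frame, calibrated once (e.g., by touching known points in the workspace).
T_ROBOT_FROM_VR = np.array([
    [0.0, -1.0, 0.0, 0.4],
    [1.0,  0.0, 0.0, 0.0],
    [0.0,  0.0, 1.0, 0.1],
    [0.0,  0.0, 0.0, 1.0],
])

WORKSPACE_SCALE = 0.5  # shrink human motion into the robot's reach


def quaternion_to_matrix(q):
    """Convert a unit quaternion (x, y, z, w) to a 3x3 rotation matrix."""
    x, y, z, w = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w), 2 * (x * z + y * w)],
        [2 * (x * y + z * w), 1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w), 2 * (y * z + x * w), 1 - 2 * (x * x + y * y)],
    ])


def controller_to_end_effector(position_vr, quaternion_vr):
    """Map a tracked controller pose to an end-effector target pose.

    position_vr: (3,) position in the VR frame, metres.
    quaternion_vr: (4,) orientation (x, y, z, w) in the VR frame.
    Returns a 4x4 homogeneous transform in the robot base frame.
    """
    # Build the homogeneous pose of the controller in the VR frame,
    # scaling the translation into the robot's workspace.
    pose_vr = np.eye(4)
    pose_vr[:3, :3] = quaternion_to_matrix(quaternion_vr)
    pose_vr[:3, 3] = WORKSPACE_SCALE * np.asarray(position_vr)
    # Express the pose in the robot base frame.
    return T_ROBOT_FROM_VR @ pose_vr
```

In a teleoperation setup, targets like this could be streamed to a Cartesian controller on the robot; in a fully virtual setup, the same mapping could drive the simulated arms instead.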
During this project, the student will have the opportunity to join an active team of developers and will be invited to contribute to the lab’s codebase. The student will have the chance to work with an Oculus Rift, state-of-the-art RGB-D sensors, and a dual-arm YuMi robot. By the end of the project, the student will have developed considerable knowledge of virtual reality and deep imitation learning, and will be invited to publish their work if the project is successful.
- Conduct a literature review.
- Implement a data collection pipeline using an Oculus Rift and existing tools such as Blender, Gazebo, or Unreal Engine.
- Develop a mapping between the tracked hand positions and robot commands.
- Implement a simple manipulation task using an imitation learning approach (a rough sketch follows this list).
- Evaluate the performance of the system on the selected task.
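As one possible starting point for the imitation-learning step, here is a minimal behavioural-cloning sketch in PyTorch; the network architecture, the 64x64 image size, and the seven-dimensional command vector (e.g., an end-effector pose delta plus a gripper command) are assumptions for illustration only.

```python
# Minimal behavioural-cloning sketch, assuming demonstrations have already
# been recorded as (camera image, robot command) pairs. Architecture and
# command dimension are illustrative, not a prescribed design.
import torch
import torch.nn as nn


class BCPolicy(nn.Module):
    """Map a 64x64 RGB image to a robot command vector."""

    def __init__(self, command_dim=7):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),   # 64 -> 30
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),  # 30 -> 13
            nn.Flatten(),
            nn.Linear(32 * 13 * 13, 128), nn.ReLU(),
            nn.Linear(128, command_dim),
        )

    def forward(self, images):
        return self.net(images)


def train(policy, loader, epochs=10):
    """Supervised regression of demonstrated commands from images."""
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for images, commands in loader:
            optimizer.zero_grad()
            loss = loss_fn(policy(images), commands)
            loss.backward()
            optimizer.step()
```

Behavioural cloning treats imitation as plain supervised regression from observations to demonstrated commands; the literature review should determine whether a more elaborate approach is warranted for the selected task.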
- Strong self-motivation and curiosity for solving challenging robotic problems.
- Excellent programming skills (i.e., having written several thousand lines of code), ideally in C++/Python.
- In-depth knowledge in at least two of the three following areas: Machine Learning, Optimization, and Computer Vision.
- Experience working with the Oculus Rift and/or game engines is an advantage.
- A very good academic record is desirable but may be compensated by expert knowledge in the areas mentioned above.
If you are interested in this project, please send your transcripts and CV to Michel Breyer (michel.breyer@mavt.ethz.ch), Tonci Novkovic (tonci.novkovic@mavt.ethz.ch) and Fadri Furrer (fadri.furrer@mavt.ethz.ch).