Learning Object Manipulation from Demonstrations using Vision and Haptic Feedback
We aim to develop a method to incorporate fine-grained tactile and visual feedback into our haptic teleoperation setup and investigate their effectiveness with state-of-the-art imitation learning methods.
In recent years, there has been significant progress in robotic manipulation, with applications ranging from household tasks to industrial automation. One of the key challenges in this area is enabling robots to manipulate objects with skill and precision in real-world environments. Videos of demonstrations have proven to be an effective and intuitive way to teach robots complex manipulation tasks [1,2].
However, video demonstrations often lack the fine-grained tactile feedback that humans rely on to adjust their grip and movement during object manipulation. In this project, we want to investigate how tactile measurements from a two-finger gripper equipped with optical tactile sensors [3] can be better used to learn various manipulation tasks from demonstrations.
To this end, we aim to develop a method to incorporate fine-grained tactile and visual feedback into our haptic teleoperation setup and investigate their effectiveness with state-of-the-art imitation learning methods.
[1] Zhao, Tony Z., et al. "Learning Fine-Grained Bimanual Manipulation with Low-Cost Hardware." RSS 2023.
[2] Chi, Cheng, et al. "Diffusion Policy: Visuomotor Policy Learning via Action Diffusion." RSS 2023.
[3] GelSight Mini: https://www.gelsight.com/gelsightmini/
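Concretely, since the GelSight Mini reports touch as RGB images of the gel surface, tactile and camera streams can be encoded with the same machinery and fused into a single observation embedding for a policy. The PyTorch sketch below illustrates one such design; the class name, layer sizes, and embedding dimension are illustrative assumptions rather than our actual architecture.

```python
import torch
import torch.nn as nn

class MultimodalEncoder(nn.Module):
    """Encodes an RGB camera frame and a GelSight tactile image into one
    observation embedding that a downstream policy can consume."""

    def __init__(self, embed_dim: int = 256):
        super().__init__()
        # Both modalities arrive as images, so a small CNN per modality suffices here.
        self.rgb_cnn = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.tactile_cnn = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.fuse = nn.Linear(64 + 64, embed_dim)

    def forward(self, rgb: torch.Tensor, tactile: torch.Tensor) -> torch.Tensor:
        # rgb: (B, 3, H, W) camera frame; tactile: (B, 3, h, w) GelSight image.
        z = torch.cat([self.rgb_cnn(rgb), self.tactile_cnn(tactile)], dim=-1)
        return self.fuse(z)
```

A downstream policy, e.g. ACT [1] or Diffusion Policy [2], would then consume this fused embedding in place of a vision-only encoding.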
- Literature review on imitation learning including diffusion models and transformers.
- Develop a method to feed back dense haptic measurements and integrate it into our existing teleoperation setup.
- Collect a tactile dataset (USB Stick Insertion, Cable Routing)
- Implement an imitation learning algorithm to learn from the collected data (see the baseline sketch after this list).
- (Optional) Extend the system to a two-hand setup and learn more complex tasks such as tying shoes.
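To ground the last two work packages, here is a minimal behavior-cloning sketch in PyTorch. It reuses the MultimodalEncoder from the earlier sketch; the tensor shapes, action dimension, and hyperparameters are placeholders, not our actual setup. Methods such as ACT [1] or Diffusion Policy [2] would replace the MSE regression head with action-sequence prediction, but the data flow is the same.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder tensors standing in for a recorded tactile dataset: paired camera
# frames, GelSight frames, and teleoperated actions at matching timesteps.
rgb = torch.randn(512, 3, 96, 96)      # wrist-camera frames
tactile = torch.randn(512, 3, 96, 96)  # GelSight tactile images
actions = torch.randn(512, 7)          # e.g. 6-DoF pose delta + gripper width
loader = DataLoader(TensorDataset(rgb, tactile, actions), batch_size=64, shuffle=True)

encoder = MultimodalEncoder(embed_dim=256)  # from the earlier sketch
head = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 7))
optim = torch.optim.Adam([*encoder.parameters(), *head.parameters()], lr=1e-4)

for epoch in range(10):
    for rgb_b, tac_b, act_b in loader:
        pred = head(encoder(rgb_b, tac_b))         # predict action from fused obs
        loss = nn.functional.mse_loss(pred, act_b) # plain behavior cloning
        optim.zero_grad()
        loss.backward()
        optim.step()
```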
- Highly motivated and autonomous student
- Programming experience in Python and PyTorch, and a background in machine learning
- (Preferred) Experience using ROS