**Master's/Bachelor's Thesis Project**
Biomimetic robotic hands match the size and proportions of human hands, and promise to take over many repetitive, tedious tasks that cannot be automated with today's robots. One way to teach such robots dexterous tasks is to learn from web-scale videos of human demonstrations. In this project, you will work toward zero- or few-shot transfer of manipulation skills from human demonstrations, building on pre-training from large-scale human hand motion videos, together with researchers from the ETH AI Center, the Institute of Neuroinformatics, and the Soft Robotics Lab. We provide our biomimetic tendon-driven robotic hand as the hardware platform for the visual control algorithm.
We are looking for a motivated Master's or Bachelor's student, preferably with a Computer Science or Data Science background, willing to work full-time on the thesis. You should have strong coding and machine learning skills.
**References**
1. Shaw, Kenneth, Shikhar Bahl, and Deepak Pathak. "VideoDex: Learning Dexterity from Internet Videos." arXiv preprint arXiv:2212.04498 (2022).
2. Brohan, Anthony, et al. "RT-1: Robotics Transformer for Real-World Control at Scale." arXiv preprint arXiv:2212.06817 (2022).
We aim to train a transformer generative model of human hand motion, using human demonstration data as a prior and robotic demonstrations as fine-tuning data.
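The pre-train/fine-tune recipe above can be sketched with a toy autoregressive model over hand-pose sequences. Everything here is an illustrative assumption rather than the project's actual architecture: the 21-keypoint pose representation, the single causal self-attention layer, and the next-frame MSE objective merely show the shape of the idea (the same next-frame prediction loss would be applied first to human video poses, then to robot demonstrations).

```python
import numpy as np

# Hypothetical setup: a hand pose as 21 keypoints x 3 coordinates = 63 values per frame.
POSE_DIM, D_MODEL, SEQ_LEN = 63, 32, 16

rng = np.random.default_rng(0)

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention: each frame attends only to itself and past frames."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)  # hide future frames
    scores[mask] = -np.inf
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy "model": embed poses, one attention layer, project back to pose space.
W_in = rng.normal(scale=0.1, size=(POSE_DIM, D_MODEL))
Wq = rng.normal(scale=0.1, size=(D_MODEL, D_MODEL))
Wk = rng.normal(scale=0.1, size=(D_MODEL, D_MODEL))
Wv = rng.normal(scale=0.1, size=(D_MODEL, D_MODEL))
W_out = rng.normal(scale=0.1, size=(D_MODEL, POSE_DIM))

def predict_next_poses(pose_seq):
    """Autoregressive prediction: output[t] is the model's guess for pose[t+1]."""
    h = causal_self_attention(pose_seq @ W_in, Wq, Wk, Wv)
    return h @ W_out

# Stand-in for hand poses extracted from web videos (pre-training) or robot demos (fine-tuning).
human_seq = rng.normal(size=(SEQ_LEN, POSE_DIM))
preds = predict_next_poses(human_seq)

# Next-frame prediction loss, shared by the pre-training and fine-tuning phases.
loss = np.mean((preds[:-1] - human_seq[1:]) ** 2)
print(preds.shape, float(loss))
```

The causal mask is what makes fine-tuning on robot demonstrations plausible: the model only ever conditions on past frames, so the same network can be rolled out one frame at a time to drive the hand at test time.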
Elvis Nava, elvis.nava@ai.ethz.ch, ETH AI Center
Prof. Robert Katzschmann, rkk@ethz.ch, Inst. of Robotics and Intelligent Systems, D-MAVT
Submit **a short motivational statement, your CV, your transcripts, and two reference contacts**.