Mixed Reality Robotic Arm Controller
This project aims to create a mixed reality (HoloLens 2) based robotic arm controller that allows the user to control the robot's interactions with its environment from within the robot's digitalized 3D (holographic) environment.
Keywords: human-robot interaction, industrial cobot, remote training, hand tracking, robotics, cobot, robot control, deep learning, machine learning, AI, mixed reality, augmented reality, digital
Collaborative robots (cobots) are designed to work safely in the immediate human environment and can therefore be very well integrated into industrial processes.
In order to train cobots efficiently on new tasks, it would be beneficial if they could learn new processes, or variations of existing processes, directly from a human worker.
This project aims to utilize capabilities of the HoloLens 2, such as hand tracking and hologram interactions, to maneuver the robot, record process sequences of object manipulations, and then translate these into robot commands.
The project is part of a cooperation with Accenture Labs (Sophia Antipolis, France). The robotic arm controller will be developed at ETH Zurich and tested on a duplicate setup in France that is accessible remotely in real time.
https://www.youtube.com/watch?v=qf7AtLVp15Q
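For illustration only, the sketch below shows one way the hand poses tracked by the HoloLens 2 could be turned into end-effector pose targets on the ROS side. This is not part of the project deliverables; the topic names, the robot base frame, and the Unity-to-ROS axis convention are assumptions made for this example.

```python
#!/usr/bin/env python3
"""Minimal sketch: republish HoloLens hand poses (streamed into ROS on a
hypothetical topic) as end-effector pose targets for the cobot controller."""
import rospy
from geometry_msgs.msg import PoseStamped

def unity_to_ros_position(x, y, z):
    # Unity is left-handed, y-up; ROS is right-handed, z-up (FLU).
    # Common mapping: ros_x = unity_z, ros_y = -unity_x, ros_z = unity_y.
    return z, -x, y

class HandToTarget:
    def __init__(self):
        # Hypothetical topic carrying the tracked palm pose from the HoloLens.
        self.sub = rospy.Subscriber("/hololens/hand/palm_pose", PoseStamped,
                                    self.on_hand_pose, queue_size=1)
        # Target pose consumed by the robot-side motion controller (assumed topic).
        self.pub = rospy.Publisher("/cobot/target_pose", PoseStamped, queue_size=1)

    def on_hand_pose(self, msg):
        target = PoseStamped()
        target.header.stamp = rospy.Time.now()
        target.header.frame_id = "base_link"  # assumed robot base frame
        p = msg.pose.position
        (target.pose.position.x,
         target.pose.position.y,
         target.pose.position.z) = unity_to_ros_position(p.x, p.y, p.z)
        # Orientation conversion (handedness flip) is omitted in this sketch.
        target.pose.orientation.w = 1.0
        self.pub.publish(target)

if __name__ == "__main__":
    rospy.init_node("hand_to_target")
    HandToTarget()
    rospy.spin()
```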
Main tasks include:
1. Researching state-of-the-art solutions
2. Familiarization with the Unity 3D game engine, the Microsoft HoloLens 2, and ROS
3. Setting up a demonstrator procedure with several physical objects in a lab environment
4. Designing and implementing features of the controller that address the different challenges the robot faces during operation (a rough sketch of translating a recorded sequence into robot commands follows this list).
5. Testing and validating the final workflow in a small case study.
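As a rough illustration of the "translate recorded sequences into robot commands" step, the following sketch replays a recorded list of waypoints through MoveIt. It assumes a standard MoveIt configuration for the Panda arm; the planning-group name, the waypoint file format, and the omission of gripper commands are all assumptions, not specifications of the project.

```python
#!/usr/bin/env python3
"""Sketch: replay a recorded sequence of end-effector waypoints via MoveIt."""
import sys
import json
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

def load_waypoints(path):
    # Hypothetical format: a JSON list of {"xyz": [x, y, z], "quat": [x, y, z, w]}
    # entries recorded from the HoloLens hand-tracking session.
    with open(path) as f:
        return json.load(f)

def replay(waypoints, group):
    for wp in waypoints:
        pose = Pose()
        pose.position.x, pose.position.y, pose.position.z = wp["xyz"]
        (pose.orientation.x, pose.orientation.y,
         pose.orientation.z, pose.orientation.w) = wp["quat"]
        group.set_pose_target(pose)
        group.go(wait=True)        # plan and execute motion to this waypoint
        group.stop()
        group.clear_pose_targets()

if __name__ == "__main__":
    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node("replay_recorded_sequence")
    # "panda_arm" is the planning group name assumed here.
    arm = moveit_commander.MoveGroupCommander("panda_arm")
    replay(load_waypoints("recorded_sequence.json"), arm)
```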
- A very autonomous and methodical way of working: you know how to structure a project, derive meaningful work packages, and systematically develop solutions.
- Solid programming skills in a common programming language (e.g., Python, C#, ...).
- Motivation to familiarize yourself with this fascinating topic of training robots based on human motion and intuitive gestures.
As part of our research at the AR Lab within the Human Behavior Group, we are working on automatically analyzing a user's interaction with their environment in scenarios such as surgery or industrial machine interaction. By collecting real-world datasets during these scenarios and using them for machine learning tasks such as activity recognition, object pose estimation, or image segmentation, we can gain an understanding of how a user performed during a given task. We can then use this information to provide the user with real-time feedback on their task through mixed reality devices, such as the Microsoft HoloLens, that guide them and prevent them from making mistakes.
- Franka Emika Panda collaborative robot (www.franka.de)
- Remote (teleoperated) cobot training
- Human motion to robot motion
- Thesis in direct cooperation with Accenture Labs (Digital Experiences team)
Please send your CV and Master's grades to Sophokles Ktistakis (ktistaks@ethz.ch)