Tool Pose Estimation in Egocentric Videos
To support our research efforts in automated human behavior analysis, we are looking for a motivated master's student who is passionate about applying machine learning to real-world datasets, for a master's thesis / semester project on the topic of Tool Pose Estimation in Egocentric Videos.
To improve the support offered by mixed reality applications during surgical procedures, process monitoring systems can be utilized to recognize distinct activities performed by the surgeon. One of the most critical interactions to recognize is the surgeon's interaction with surgical tools – a key element of any surgery that must be recognized correctly and robustly to infer the surgical workflow being performed. The goal of this thesis is to develop an object pose estimation method for surgical tools during spinal fusion surgery, operating on egocentric videos.
The project will utilize recordings (video, head pose, hand pose) made with the Microsoft HoloLens 2 and consist of the following tasks:
1. Researching state-of-the-art object pose estimation and object tracking techniques
2. Implementing the proposed tool pose estimation using frameworks such as PyTorch, OpenCV, etc.
3. Benchmarking the implemented solution on different setups to characterize its performance under influencing factors such as occlusion and illumination
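To illustrate the problem behind tasks 1 and 2: object pose estimation recovers the 6-DoF rotation and translation of a known tool model that best explain its appearance in the camera image. A minimal sketch of the forward projection model that such an estimator inverts is shown below; the tool geometry and camera intrinsics are invented placeholders, not the project's actual tool models or the HoloLens 2 calibration:

```python
import numpy as np

def rotation_z(theta):
    """Rotation matrix about the camera z-axis (illustrative pose parameter)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def project(points_3d, R, t, K):
    """Project 3D model points into the image with a pinhole camera model."""
    cam = points_3d @ R.T + t      # model frame -> camera frame (the 6-DoF pose)
    uv = cam @ K.T                 # apply camera intrinsics
    return uv[:, :2] / uv[:, 2:3]  # perspective divide -> pixel coordinates

# Hypothetical tool points in metres: tip at the origin, a shaft point 12 cm away.
model = np.array([[0.0, 0.0, 0.0], [0.12, 0.0, 0.0]])

# Assumed pinhole intrinsics (fx, fy, cx, cy) -- placeholder values only.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])

# Project under an example pose: slight rotation, 0.5 m in front of the camera.
uv = project(model, rotation_z(0.1), np.array([0.0, 0.0, 0.5]), K)
print(uv)  # tool tip lands at the principal point (320, 240)
```

Pose estimation then amounts to finding the R and t that minimize the discrepancy between such projected points and the tool's detected 2D locations, e.g. via PnP solvers in OpenCV or learned regressors in PyTorch.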
- Good programming skills in Python (or Java, C#, C, C++)
- Experience with machine (deep) learning and computer vision
- Previous hands-on experience with frameworks such as PyTorch, OpenCV, scikit-learn
- Methodical way of working
- Ability to take ownership in shaping the direction of the project
As part of our research at the AR Lab within the Human Behavior Group, we work on automatically analyzing a user's interaction with their environment in scenarios such as surgery or industrial machine operation. By collecting real-world datasets in those scenarios and using them for machine learning tasks such as activity recognition, object pose estimation, or image segmentation, we gain an understanding of how a user performed a given task. We can then use this information to provide the user with real-time feedback through mixed reality devices such as the Microsoft HoloLens, guiding them and preventing them from making mistakes.
- Master Thesis
- Semester Project
- Autonomy in shaping the direction of the project
Please send your CV and master course grades to Sophokles Ktistakis (ktistaks@ethz.ch)