SiROP

Motion Tracking for Surgical Suturing using Wrist-Worn IMUs and RGB-D Sensors

Create a stable hand and tool motion tracker for surgical suturing by complementing existing tracking methods with a wrist-worn IMU and external RGB-D sensors. The system will use pattern analysis of the captured motions for surgical skill assessment.

Keywords: Computer vision, human pose estimation, tool pose estimation, mixed reality, augmented reality, AR, surgery, medical, surgical, deep learning

  • Surgical skill assessment is crucial for ensuring patient safety and outcomes by evaluating a surgeon's competence and proficiency in performing surgical procedures. In collaboration with the USZ Department of Cardiac Surgery, the goal of this project is to create an automatic skill-assessment system with the following components:

    1. HoloLens 2 for 3D hand tracking

    2. External RGB-D camera and smartwatch-based IMU sensor to track hand and tool motions

    3. Pattern analysis on the captured motion data to derive metrics for skill assessment
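As a rough illustration of the pattern-analysis component (not part of the posting), commonly used skill-related metrics computed from a tracked hand or tool trajectory include total path length and a dimensionless log-jerk smoothness score. A minimal sketch, assuming positions sampled at a fixed rate:

```python
import numpy as np

def motion_metrics(positions, dt):
    """Compute simple skill-related metrics from a 3D trajectory.

    positions: (N, 3) array of positions in metres, sampled every dt seconds.
    Returns (path_length, log_dimensionless_jerk); a more negative jerk
    score indicates a smoother, typically more expert, motion.
    """
    positions = np.asarray(positions, dtype=float)
    # Total path length: sum of distances between consecutive samples.
    path_len = np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1))
    # Jerk is the third time derivative of position (finite differences).
    jerk = np.diff(positions, n=3, axis=0) / dt**3
    duration = dt * (len(positions) - 1)
    # Dimensionless normalisation by duration and path length, then log.
    mean_sq_jerk = np.mean(np.sum(jerk**2, axis=1))
    ldj = np.log(mean_sq_jerk * duration**5 / path_len**2)
    return path_len, ldj
```

The function names and the particular normalisation are illustrative; the project would likely explore a broader set of metrics (e.g., duration, idle time, tremor frequency).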

  • 1) Integrate the system components: HoloLens 2, smartwatch IMU, and RGB-D camera

    2) Implement computer vision-based methods for hand and tool motion tracking

    3) Analyze motion patterns and derive metrics for skill assessment

    (Optional) Integrate a visual front-end, for example on the HoloLens 2 or iPad
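A first hurdle in step 1 is that the smartwatch IMU and the RGB-D camera sample at different rates on (ideally synchronised) clocks. A minimal sketch of nearest-neighbour alignment of IMU samples to camera frame timestamps, assuming timestamps are already on a common clock (function and variable names are illustrative):

```python
import numpy as np

def align_imu_to_frames(imu_t, imu_data, frame_t):
    """Pick, for each camera frame, the IMU sample nearest in time.

    imu_t:    (M,) sorted IMU timestamps in seconds.
    imu_data: (M, D) IMU readings (e.g. accelerometer + gyroscope channels).
    frame_t:  (N,) RGB-D frame timestamps on the same clock.
    Returns an (N, D) array with one IMU reading per frame.
    """
    imu_t = np.asarray(imu_t)
    frame_t = np.asarray(frame_t)
    # Where would each frame timestamp insert into the sorted IMU timeline?
    idx = np.searchsorted(imu_t, frame_t)
    idx = np.clip(idx, 1, len(imu_t) - 1)
    # Choose the closer of the two neighbouring IMU samples.
    left, right = imu_t[idx - 1], imu_t[idx]
    idx -= np.abs(frame_t - left) < np.abs(right - frame_t)
    return np.asarray(imu_data)[idx]
```

In practice one would also estimate the clock offset between devices (e.g., via a shared synchronisation event) and might interpolate rather than pick nearest samples, but this captures the basic alignment step.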

  • ... a highly autonomous and methodical way of working. You know how to structure a project, derive meaningful work packages, and systematically develop solutions.
    ... solid programming skills in a common programming language (e.g., Python, C#, ...).
    ... prior experience with computer vision and deep learning (OpenCV, PyTorch, etc.)

  • As part of our research at the AR Lab within the Human Behavior Group, we work on automatically analyzing a user's interaction with their environment in scenarios such as surgery or industrial machine operation. By collecting real-world datasets in these scenarios and using them for machine learning tasks such as activity recognition, object pose estimation, or image segmentation, we can gain an understanding of how a user performed a given task. We can then use this information to provide the user with real-time feedback via mixed reality devices, such as the Microsoft HoloLens, that can guide them and prevent mistakes.

  • - Master Thesis / Semester Thesis
    - Collaboration with USZ Department of Cardiac Surgery
    - Hand and Tool Motion Estimation
    - Computer Vision and Deep Learning

  • Please send your CV and Master's grades to Rui Wang and Sophokles Ktistakis (ruiwang46@ethz.ch, ktistaks@ethz.ch)

Calendar

Earliest start: 2023-09-18
Latest end: No date

Location

pd|z Product Development Group Zurich (ETHZ)

Labels

Semester Project

Master Thesis

Topics

  • Information, Computing and Communication Sciences