DL-based Human Activity Tracking for Understanding Pathogen Transmissions in Acute Healthcare Settings
To better understand the spread of bacteria and viruses in an acute care hospital, this project combines deep learning-based methods, such as activity recognition and object detection, to automatically collect and analyze data on all hand-to-surface exposures by doctors, nurses, and patients.
Keywords: Deep Learning, Data Science, Activity Recognition, Object Detection, Sensor Fusion, Camera, IMU, Medical, Hospital
Healthcare-associated infections are a major threat to patient safety in acute healthcare around the globe. Even today, between 5 and 15% of patients acquire an infection during their hospital stay that was not present at admission. Moreover, the spread of multi-resistant microorganisms between patients threatens to render antibiotics ineffective. Despite decades of research in infection prevention, the exact transmission pathways are still poorly understood.
This project aims to create an automated pipeline that captures hand-to-surface contacts during medical scenarios in order to map microbial transmission pathways.
In this project, data from two stereo cameras (ZED 2) and a wrist-worn IMU (Galaxy Watch 4) will be fused and evaluated to detect hand contacts of medical staff with objects and critical areas, such as instruments, patient-related areas, or monitoring equipment. To this end, you will implement deep learning-based methods, such as activity recognition on body-skeleton and IMU data, and object detection on the stereo camera streams.
The goal is to create an automated pipeline that can be used to optimize medical procedures, room configurations and hygiene rules.
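To illustrate what fusing the two modalities could look like, here is a minimal sketch of late fusion: per-activity scores from a skeleton-based model and an IMU-based model are combined with a weighted average before the final prediction. The activity labels, weighting, and function names are illustrative assumptions, not part of the project's actual codebase.

```python
# Hypothetical late-fusion sketch; labels and weights are illustrative only.
ACTIVITIES = ["hand_rub", "touch_patient", "touch_monitor", "idle"]

def fuse_scores(skeleton_scores, imu_scores, skeleton_weight=0.6):
    """Weighted late fusion of two per-activity score vectors,
    normalized so the fused scores form a distribution."""
    fused = [skeleton_weight * s + (1.0 - skeleton_weight) * i
             for s, i in zip(skeleton_scores, imu_scores)]
    total = sum(fused)
    return [f / total for f in fused]

def predict(skeleton_scores, imu_scores):
    """Return the activity label with the highest fused score."""
    fused = fuse_scores(skeleton_scores, imu_scores)
    return ACTIVITIES[max(range(len(fused)), key=fused.__getitem__)]
```

In practice, the weighting (or a learned fusion layer) would be tuned on validation data; the sketch only shows the structure of combining two modality-specific models.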
- Gaining familiarity with the sensor pipeline (ZED cameras and IMU sensor)
- Researching and implementing appropriate methods for activity recognition and object detection
- Fusing camera and IMU sensor data to get a robust representation of the scenery
- Combining activity recognition and object detection to get a complete understanding of the hand-object interactions
- Testing and validating your pipeline in a user study with medical staff
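As a toy example of the last two steps, combining a detected wrist keypoint with an object bounding box can reduce to a geometric check: a contact is flagged when the 3D wrist position lies within a small distance of the object's axis-aligned box. The threshold and function names below are illustrative assumptions.

```python
def point_to_box_distance(point, box_min, box_max):
    """Euclidean distance from a 3D point to an axis-aligned
    bounding box (0.0 if the point lies inside the box)."""
    d2 = 0.0
    for p, lo, hi in zip(point, box_min, box_max):
        if p < lo:
            d2 += (lo - p) ** 2
        elif p > hi:
            d2 += (p - hi) ** 2
    return d2 ** 0.5

def is_contact(hand_xyz, box_min, box_max, threshold_m=0.05):
    """Flag a hand-object contact when the wrist keypoint is
    within threshold_m meters of the object's bounding box."""
    return point_to_box_distance(hand_xyz, box_min, box_max) <= threshold_m
```

A real pipeline would add temporal smoothing and account for keypoint uncertainty, but this shows the core idea of linking skeleton output to detected objects.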
- Hands-on and pragmatic approach
- Solid programming skills in Python
- Familiarity with PyTorch and OpenCV
As part of our research at the AR Lab within the Human Behavior Group, we work on automatically analyzing users' interactions with their environment in scenarios such as surgery or industrial machine operation. By collecting real-world datasets in these scenarios and using them for machine learning tasks such as activity recognition, object pose estimation, or image segmentation, we gain an understanding of how a user performed a given task. We can then use this information to provide users with real-time feedback via mixed reality devices, such as the Microsoft HoloLens, guiding them through the task and preventing mistakes.
- Master Thesis
- Medical Data Science
Please send your CV and master course grades to Sophokles Ktistakis (ktistaks@ethz.ch)