Cooperative SLAM for enhanced Smart-Glasses controlled robot guidance
Robotic automation represents society's next major leap. While industry has already embraced robotization, robot-driven domestic aid is only at the beginning of its revolution: dealing with unconstrained everyday environments is far more challenging than automating manufacturing production lines. The growing availability of semi-autonomous assistive quadrupedal robots that can handle obstacles has triggered this change. Nonetheless, robot control still requires active human supervision, which can be tedious in ordinary surroundings and even impossible in clinical circumstances.
Keywords: Robotics, Computer Vision, SLAM
This project is aimed at contributing to the development of a cooperative SLAM (Simultaneous Localization and Mapping) system for enhanced Smart-Glasses-controlled robot guidance. The goal is to develop a system that allows a user to guide a robot through an environment using a pair of Smart-Glasses, thus allowing for hands-free operation.
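A core ingredient of any cooperative SLAM system is expressing the maps built by the two agents in a single common frame. As a minimal illustrative sketch (not part of the project specification; all names are hypothetical), the rigid transform between the robot's and the glasses' map frames can be estimated from a handful of landmark correspondences with the Kabsch algorithm:

```python
import numpy as np

def align_maps(pts_robot, pts_glasses):
    """Estimate the rigid transform (R, t) that maps glasses-frame
    landmarks onto the corresponding robot-frame landmarks, via the
    Kabsch algorithm (SVD of the centered cross-covariance matrix).
    pts_robot, pts_glasses: (N, 3) arrays of matched landmark positions."""
    mu_r = pts_robot.mean(axis=0)
    mu_g = pts_glasses.mean(axis=0)
    # cross-covariance between centered point sets
    H = (pts_glasses - mu_g).T @ (pts_robot - mu_r)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_r - R @ mu_g
    return R, t
```

With the transform in hand, every landmark and pose from the glasses can be re-expressed in the robot's map frame (`R @ p + t`), yielding the shared map both agents operate in.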
The main idea is to use the sensors on the robot as well as the sensors on the Smart-Glasses to create a common map of the environment. Furthermore, the Smart-Glasses are equipped with a gaze-tracking system. The final step is to use this gaze estimation together with the map to guide the robot through the environment to a specific goal simply by looking at it, enabling persons with severe paraplegia to control an assistive robot.
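The gaze-to-goal step above can be sketched very simply: once the glasses' pose and gaze direction are known in the shared map frame, intersecting the gaze ray with the floor plane yields a candidate navigation goal for the robot. This is only a toy sketch under strong assumptions (flat floor at a known height, gaze already expressed in the map frame); the function and its parameters are hypothetical:

```python
import numpy as np

def gaze_to_goal(p_glasses, gaze_dir, floor_z=0.0):
    """Intersect the user's gaze ray with the floor plane z = floor_z
    to obtain a 2D navigation goal in the shared map frame.
    p_glasses: (3,) glasses position in the map frame.
    gaze_dir:  (3,) gaze direction in the map frame (need not be unit)."""
    d = np.asarray(gaze_dir, dtype=float)
    d = d / np.linalg.norm(d)
    if d[2] >= -1e-6:                       # gaze not pointing downward
        return None                         # no intersection with the floor
    s = (floor_z - p_glasses[2]) / d[2]     # ray parameter at the plane
    hit = np.asarray(p_glasses, dtype=float) + s * d
    return hit[:2]                          # (x, y) goal on the floor
```

In a full pipeline the returned point would be validated against the map (e.g. rejected if it falls inside an obstacle) before being sent to the robot's planner.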
The final system should be tested in real-world scenarios and analyzed in terms of accuracy, robustness, and usability. A special focus must be put on the extensibility and scalability of the system, to allow for the future addition of more robots, sensors, and eventually a robotic arm.
** Prerequisites **
- C++ & Python
- Computer Vision
- ROS
** Character **
- 20% Literature study
- 60% Implementation/Development
- 20% Validation & Testing
** Project Tasks **
- Review existing cooperative SLAM algorithms
- Investigate strengths and weaknesses and choose the best one for implementation
- Adapt and modify the algorithm to incorporate gaze commands
- Evaluate the final full pipeline on accuracy and efficiency
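For the accuracy part of the evaluation, a standard SLAM metric is the absolute trajectory error (ATE). As a minimal sketch (assuming the estimated and ground-truth trajectories are already time-synchronized and expressed in the same frame, e.g. after an alignment step):

```python
import numpy as np

def absolute_trajectory_error(est, gt):
    """Root-mean-square position error between an estimated trajectory
    and ground truth, both given as (N, 3) arrays of time-aligned
    positions in a common frame."""
    err = np.linalg.norm(est - gt, axis=1)  # per-pose Euclidean error
    return float(np.sqrt(np.mean(err ** 2)))
```

In practice, ground truth would come from an external reference such as a motion-capture system, and the estimated trajectory would first be rigidly aligned to it before computing the error.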