SiROP
This opportunity is not published. No applications will be accepted.

Accurate SLAM for Human-Robot Teams

We extend the lamar.ethz.ch benchmark to develop accurate SLAM methods that can co-register drones, legged robots, wheeled robots, smartphones, and mixed reality headsets based on visual SLAM.

Keywords: SLAM, visual registration, benchmark


Mixed-reality headsets and handheld devices offer the most intuitive interface for operating robots, e.g., giving them commands in 3D or checking what they plan to do next. Such human-robot teaming, however, requires that the mixed-reality devices and the robots be registered to the same 3D environment. Robots and humans have quite different viewpoints and motion patterns, which makes this registration difficult. At the same time, nobody wants to wave their camera around for five minutes before they can operate the robot, so the registration needs to be possible from only a few images.

As part of this project, you will operate wheeled, legged, or flying robots to collect data from buildings and labs at ETH. We then use the data to generate highly accurate ground-truth poses and measure how well existing SLAM algorithms register the agents with respect to each other.
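Once corresponding 3D points between two agents' maps have been found (e.g., via feature matching), co-registration reduces to estimating the rigid transform between the two coordinate frames. As a minimal illustrative sketch (not part of the benchmark; the function name and use of NumPy are our own assumptions), the standard Kabsch/Umeyama closed-form solution looks like this:

```python
import numpy as np

def rigid_alignment(src, dst):
    """Estimate the rigid transform (R, t) mapping points src -> dst.

    src, dst: (N, 3) arrays of corresponding 3D points, N >= 3,
    e.g., map points seen by both a robot and a headset.
    Returns rotation R (3x3) and translation t (3,) such that
    dst_i ~= R @ src_i + t for each correspondence i.
    """
    mu_s = src.mean(axis=0)
    mu_d = dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det(R) = +1).
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

In practice the correspondences are noisy and contaminated by outliers, so a solver like this would run inside a RANSAC loop; the closed-form step above is only the inner model fit.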

  • Please send your CV and transcript to blumh@ethz.ch and zuria.bauer@inf.ethz.ch.

Calendar

Earliest start: 2023-08-17
Latest end: No date

Location

Computer Vision and Geometry Group (ETHZ)

Labels

Semester Project

Bachelor Thesis

Master Thesis

Topics

  • Information, Computing and Communication Sciences