High-speed Augmented Reality with Event-based Cameras
The goal of this project is to achieve high-speed camera pose tracking with a single event-based camera, enabling its application in virtual reality. The event camera will detect and track features that are discovered on-the-fly in an a priori unknown scene.
Bio-inspired event-based cameras have a great potential for high-speed tracking in robotics applications. Conventional CMOS cameras retrieve full intensity frames at fixed rate, usually capturing a wealth of redundant information across consecutive frames. In contrast, each pixel in an event-based camera reacts asynchronously and only when the perceived intensity level changes, generating a so-called “event”.
The sparse stream of events generated by such cameras has a high temporal resolution and contains much less information than conventional visual images, potentially leading to speedups of several orders of magnitude in classical computer vision algorithms.
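The event-generation principle described above can be sketched in a few lines: each pixel independently compares its current log-intensity against the last value it reported, and fires a positive or negative event once the change exceeds a contrast threshold. The function and threshold below are illustrative assumptions, not part of any specific camera's API.

```python
import numpy as np

def generate_events(prev_log, new_log, t, threshold=0.2):
    """Simulate per-pixel event generation between two log-intensity images.

    Emits one (x, y, t, polarity) event for every pixel whose log-intensity
    change exceeds the contrast threshold; all other pixels stay silent,
    which is what makes the event stream sparse.
    """
    diff = new_log - prev_log
    events = []
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    for y, x in zip(ys, xs):
        polarity = 1 if diff[y, x] > 0 else -1  # brightness up vs. down
        events.append((x, y, t, polarity))
    return events
```

In a real sensor each pixel fires asynchronously with its own timestamp; this frame-to-frame simulation only illustrates the thresholded, change-driven nature of the output.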
This project is framed in the context of virtual reality, in which an accurate camera pose estimate is required to project virtual objects into the scene. Such an estimate is ill-posed using conventional frame-based cameras under high-speed motions due to motion blur. Event-based cameras have the potential to overcome such problems by processing only the relevant changes in the scene at a high temporal resolution.
Camera pose estimation can be performed robustly using known patterns that are present in the scene. This project, however, will focus on estimating the pose without an a priori known pattern. Instead, the developed algorithm shall detect, estimate and learn the pattern on-the-fly, unlocking the possibility of applying such an algorithm to more challenging and complex scenarios.
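Pose estimation from a known pattern typically amounts to minimizing the reprojection error: project the pattern's known 3D points with a candidate pose and compare against their observed image locations. The minimal sketch below shows only that error term, assuming a standard pinhole model with intrinsics K; the actual project would feed this into a PnP solver or nonlinear optimizer, and the function names here are hypothetical.

```python
import numpy as np

def reproject(points_3d, R, t, K):
    """Project known 3D pattern points into the image with pose (R, t)."""
    cam = (R @ points_3d.T).T + t      # world frame -> camera frame
    pix = (K @ cam.T).T                # camera frame -> homogeneous pixels
    return pix[:, :2] / pix[:, 2:3]    # perspective divide

def reprojection_error(points_3d, observed_2d, R, t, K):
    """Mean distance between observed and reprojected pattern points.

    A pose estimator searches for the (R, t) that drives this to zero.
    """
    pred = reproject(points_3d, R, t, K)
    return float(np.mean(np.linalg.norm(pred - observed_2d, axis=1)))
```

With events instead of frames, the observations would be event locations associated with the tracked pattern features, but the underlying geometric error is the same.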
The student taking this project needs to be highly motivated, with strong programming skills (C/C++ preferred) and experience in Computer Vision. Previous experience with Linux or ROS would be beneficial. The student will have the opportunity to work with a real setup and equipment offered by the Vision for Robotics Lab.
- WP1: Research into existing works regarding event-based camera pose estimation, pattern detection and tracking.
- WP2: Implementation of a module for event-based camera pose estimation using a known pattern.
- WP3: Development of a module for online event-based detection and learning of distinctive patterns.
- WP4: Implementation of a module for camera pose estimation while tracking and learning distinctive patterns on-the-fly.
- WP5: Experimentation and evaluation of the system’s performance under different conditions (e.g. lighting, motion), and drawing of conclusions.
Not specified
Interested students, please contact Ignacio Alzugaray ( aignacio@student.ethz.ch ) with Marco Karrer ( karrerm@student.ethz.ch ) in CC.