Event-based feature detection for highly dynamic tracking
Event cameras are an exciting new technology enabling sensing of highly dynamic content over a broad range of illumination conditions. The present thesis explores novel, sparse, event-driven paradigms for detecting structure and motion patterns in raw event streams.
Event cameras are relatively new, vision-based exteroceptive sensors built on standard CMOS technology. Unlike conventional cameras, they do not measure absolute brightness frame by frame, but relative changes in pixel-level brightness. Every pixel of an event camera independently observes the local brightness pattern, and whenever the latter changes by a minimum relative amount with respect to a previously stored reference value, a measurement is triggered in the form of a time-stamped event indicating the image location as well as the polarity of the change (brighter or darker) [2]. The pixels operate asynchronously and can fire events at very high rates. Owing to this design, event cameras do not suffer from the same artifacts as regular cameras, and continue to perform well under high dynamics or challenging illumination conditions.
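To illustrate the triggering principle described above, the following is a minimal, hedged sketch of a single pixel's event-generation logic (the threshold value and function names are illustrative, not tied to any specific sensor):

```python
import math

def simulate_pixel_events(timestamps, intensities, threshold=0.2):
    """Toy simulation of one event-camera pixel.

    An event (t, polarity) is emitted whenever the log-brightness deviates
    from the stored reference by at least `threshold`; the reference is then
    updated by one threshold step. Polarity +1 means brighter, -1 darker.
    """
    events = []
    ref = math.log(intensities[0])  # reference log-brightness
    for t, intensity in zip(timestamps[1:], intensities[1:]):
        level = math.log(intensity)
        while level - ref >= threshold:   # brightness increased: ON events
            ref += threshold
            events.append((t, +1))
        while ref - level >= threshold:   # brightness decreased: OFF events
            ref -= threshold
            events.append((t, -1))
    return events

# A pixel whose brightness doubles, then halves again:
events = simulate_pixel_events([0.0, 1.0, 2.0], [100.0, 200.0, 100.0])
print(events)  # three ON events at t=1.0, three OFF events at t=2.0
```

Note how a single brightness step produces a burst of events whose count depends on the contrast threshold, which is why event rates scale with scene dynamics rather than with a fixed frame clock.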
Event cameras currently enjoy growing popularity and represent an interesting new alternative for exteroceptive sensing in robotics in scenarios with high dynamics and/or challenging illumination conditions. The focus of the present thesis lies on 3D motion estimation with event cameras, and in particular on event-driven, computationally efficient methods that can trigger motion hypotheses from sparse raw events. Initial theoretical advances in this direction have been presented in recent literature [3,4,5], though these methods are still limited by the assumptions they make. The present thesis will push these boundaries by proposing novel geometry- and learning-based representations.
The proposed thesis will be conducted at the Robotics and AI Institute, a new top-notch partner institute of Boston Dynamics that pushes the boundaries of control and perception in robotics. Selection is highly competitive. Potential candidates are invited to submit their CV and grade sheet, after which shortlisted students will be invited to an on-site interview.
[1] Event-Based, 6-DOF Camera Tracking from Photometric Depth Maps. TPAMI, 40(10):2402-2412, 2017.
[2] The Silicon Retina. Scientific American, 264(5):76-83, 1991.
[3] A 5-Point Minimal Solver for Event Camera Relative Motion Estimation. In Proceedings of the International Conference on Computer Vision (ICCV), 2023.
[4] An N-Point Linear Solver for Line and Motion Estimation with Event Cameras. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2024.
[5] Full-DoF Egomotion Estimation for Event Cameras Using Geometric Solvers. arXiv preprint, https://arxiv.org/html/2503.03307v1.
● Literature research
● Extend the mathematical foundation for sparse event-based motion estimation
● Propose novel detectors that extend operability from lines and constant-velocity motion to full 6-DoF motion estimation from either points or lines, or from other specific object trajectories such as ballistic curves
● Investigate learning-based, sparse event-based motion detectors to handle more general cases
● Apply the technology to real-world data to track fast ego-motion or ballistic object motion in the environment
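As a concrete point of departure for the constant-velocity case mentioned in the tasks above, a classic baseline (not one of the referenced solvers, but a common starting point in the event-vision literature) fits a plane to events in (x, y, t) space: an edge translating at constant image velocity sweeps out a plane whose slope encodes the inverse velocity. A minimal sketch on noise-free synthetic events, with all parameter values chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic events: a vertical edge sweeping the image plane at vx px/s.
vx = 120.0                                 # ground-truth horizontal velocity
t = rng.uniform(0.0, 0.05, 500)            # event timestamps [s]
x = vx * t                                 # edge position at each timestamp
y = rng.uniform(0.0, 240.0, 500)           # events spread along the edge

# Events of a constant-velocity edge lie on a plane t = a*x + b*y + c.
A = np.column_stack([x, y, np.ones_like(x)])
(a, b, c), *_ = np.linalg.lstsq(A, t, rcond=None)

v_est = 1.0 / a                            # slope along x encodes 1/velocity
print(v_est)                               # recovers ~120.0 px/s
```

This local plane-fitting view is exactly what the thesis aims to generalize: from a single constant-velocity edge to full 6-DoF motion from points, lines, or specific object trajectories.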
● Excellent knowledge of C++
● Computer vision experience
● Knowledge of geometric computer vision
● Plus: Experience with event cameras
Laurent Kneip (lkneip@theaiinstitute.com)
Please include your CV and up-to-date transcript.