Fast, change-aware map-based camera tracking
Experiment with Gaussian Splatting based map representations for highly efficient camera tracking with simultaneous change detection and map updating. Apply the method to different exteroceptive sensing modalities.
Keywords: Localization, Camera Tracking, Gaussian Splatting, Change Detection
Novel techniques for 3D environment representation such as NeRF [2] and Gaussian Splatting [1,3] can efficiently render realistic images of an environment and, if formulated as a differentiable function of camera pose, can be embedded into a photometric loss to enable camera tracking across modalities such as RGB cameras and event cameras. However, such representations cannot by default accommodate changes in the scene, which occur in many practically relevant scenarios (e.g. domestic environments). Furthermore, in some of these scenarios the changes that occur over time may even be expected and according to plan (e.g. construction environments).
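The tracking principle described above can be sketched in a few lines. The example below is a minimal illustration, not part of the project description: it replaces the Gaussian Splatting renderer with a hypothetical 1-D stand-in (`render`, a smooth function of a single pose parameter), and minimizes the photometric loss by gradient descent, using finite differences in place of autodiff.

```python
import numpy as np

# Hypothetical stand-in for a differentiable renderer: a 1-D "scene"
# whose appearance is a smooth function of one pose parameter (a shift).
# A real Gaussian Splatting renderer exposes the same interface:
# render(pose) -> image, differentiable w.r.t. pose.
xs = np.linspace(0.0, 1.0, 64)

def render(shift):
    return np.exp(-((xs - 0.5 - shift) ** 2) / 0.02)

def photometric_loss(shift, observed):
    # Sum of squared per-pixel intensity residuals.
    return 0.5 * np.sum((render(shift) - observed) ** 2)

def track(observed, shift0=0.0, lr=5e-4, iters=2000, eps=1e-5):
    """Estimate the pose (shift) by gradient descent on the photometric
    loss; a central finite difference stands in for autodiff."""
    shift = shift0
    for _ in range(iters):
        g = (photometric_loss(shift + eps, observed)
             - photometric_loss(shift - eps, observed)) / (2 * eps)
        shift -= lr * g
    return shift

true_shift = 0.07
estimate = track(render(true_shift))
```

The same structure carries over to the full problem: the pose becomes a 6-DoF transform, the image becomes a rendered splat map, and the optimizer becomes a few Gauss-Newton or Adam steps per frame.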
The present thesis investigates recent 3D reconstruction methods such as Gaussian Splatting and considers their use for real-time vision-based sensor tracking. Rather than relying on a static map of the environment, the core of the method is a robust change detection and map updating mechanism based on a combination of measurement residuals and available priors. The goal is to enable long-term vision-based localization in gradually changing environments while simultaneously using new sensing data to update the map. Ultimately, the method will be extended to sensors that excel at highly dynamic motion tracking but are not an obvious first choice for mapping (i.e. event cameras).
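The residual-based change detection mentioned above can be illustrated with a toy sketch. The names and thresholds below are illustrative assumptions, not part of the project: pixels whose photometric residual between the map-rendered prediction and the live view exceeds a threshold are flagged as changed, and a semantic prior could then grow these seeds into coherent object segments.

```python
import numpy as np

def detect_changes(rendered, observed, tau=0.2):
    """Flag pixels whose photometric residual between the map-rendered
    prediction and the live observation exceeds threshold tau."""
    residual = np.abs(observed - rendered)
    return residual > tau  # boolean change mask

# Toy scene: the map predicts an empty view, but an object has appeared.
map_view = np.zeros((8, 8))
live_view = map_view.copy()
live_view[2:5, 2:5] = 1.0  # new object occupies a 3x3 region
mask = detect_changes(map_view, live_view)
```

In practice the raw mask would be noisy (exposure, occlusion, pose error), which is why the listing suggests combining residuals with priors before committing to a map update.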
The proposed thesis will be conducted at the Robotics and AI Institute, a new top-notch partner institute of Boston Dynamics that pushes the boundaries of control and perception in robotics. Selection is highly competitive. Potential candidates are invited to submit their CV and grade sheet, after which selected students will be invited to an on-site interview.
[1] GS-EVT: Cross-Modal Event Camera Tracking Based on Gaussian Splatting. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2025.
[2] NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis, 2020. arXiv: https://arxiv.org/abs/2003.08934
[3] 3D Gaussian Splatting for Real-Time Radiance Field Rendering, 2023. arXiv: https://arxiv.org/abs/2308.04079
● Literature research
● Create a Gaussian Splatting representation of an environment and use it for tracking
● Simultaneously ensure continuous change detection in the environment; the detection could rely on semantic priors to identify coherent image segments that have become inconsistent
● Propose an efficient map update strategy that builds on the introduced change detection and is derived from the original Gaussian Splatting algorithm
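One plausible shape for such a change-gated map update is sketched below. Everything here is an assumption for illustration (the counter scheme, the threshold, the function name): each map element carries a staleness counter, elements that repeatedly project into changed pixels are retired, and the freed regions can then be re-optimized by the standard Gaussian Splatting densification step.

```python
def update_stale_counters(counters, hits, thresh=3):
    """Change-gated map maintenance sketch.

    counters[i] counts consecutive frames in which map element i
    projected into a pixel flagged as changed (hits[i] is True).
    Elements whose counter reaches `thresh` are returned for retirement;
    a single inconsistent frame resets the counter, so transient
    occlusions do not erode the map.
    """
    counters = [c + 1 if h else 0 for c, h in zip(counters, hits)]
    retire = [i for i, c in enumerate(counters) if c >= thresh]
    return counters, retire

# Element 1 is persistently inconsistent over three frames; the others
# are confirmed each frame and keep a zero counter.
counters = [0, 0, 0]
for _ in range(3):
    counters, retire = update_stale_counters(
        counters, hits=[False, True, False])
```

Requiring persistence before retiring an element is one simple way to trade update latency against robustness to spurious residuals.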
● Excellent knowledge of Python
● Computer vision experience
● Knowledge of recent image rendering techniques
Laurent Kneip (lkneip@theaiinstitute.com)
Igor Bogoslavskyi (ibogoslavskyi@theaiinstitute.com)
Please include your CV and up-to-date transcript.