Real-Time Point Cloud Segmentation (MT)
Current 3D perception pipelines fall short in both accuracy and speed. Inherent noise in point cloud measurements, as well as occlusions, means one starts with sub-par data. In addition, very little annotated data is available for direct point cloud segmentation. Workarounds have therefore been tested, such as depth projection of 2D segmentation masks [us, SAMPro3D…]. However, these tend to be slow, because multiple views from additional cameras are needed to reconstruct the scene, and they require a prior 2D semantic segmentation. Direct point cloud segmentation has the potential to be much faster, since multiple view angles can easily be concatenated; however, it lacks datasets of sufficient size and quality to build foundation models. Your task would thus be to fine-tune or create a neural network for point cloud segmentation, as well as a dataset for supervised learning. For this, you can use our preexisting vision pipelines or data available online. To create annotations, we propose to automatically generate ground-truth labels with SAMPro3D to keep manual labelling minimal.
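To illustrate the label-lifting idea behind such annotation pipelines, below is a minimal sketch (not the PDZ pipeline) of back-projecting a 2D segmentation mask onto a point cloud through a depth image with a pinhole camera model. The intrinsics and array names (fx, fy, cx, cy, depth, mask) are illustrative assumptions.

```python
import numpy as np

def lift_mask_to_points(depth: np.ndarray, mask: np.ndarray,
                        fx: float, fy: float, cx: float, cy: float):
    """Back-project each valid depth pixel to 3D and attach its 2D mask label.

    depth: (H, W) metric depth, 0 where invalid.
    mask:  (H, W) integer labels from a 2D segmenter (e.g. SAM).
    Returns (N, 3) camera-frame points and (N,) per-point labels.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    points = np.stack([x, y, z], axis=-1)   # 3D points in the camera frame
    labels = mask[valid]                     # label inherited from the 2D mask
    return points, labels
```

Point labels lifted this way from several calibrated views can then be merged into one annotated cloud, which is essentially what prompt-based methods such as SAMPro3D automate.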
We drive research in human-robot collaboration at PDZ
• Derive annotated point cloud data from multiple view angles (using SAMPro3D or other methods).
• Train or fine-tune an existing model with the goal of real-time point cloud segmentation (see the sketch after this list).
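As a rough picture of the second task, here is a minimal sketch of a supervised per-point segmentation training loop in PyTorch. The tiny shared-MLP model, the class count, and the random tensors are placeholders only; in the thesis they would be replaced by a real architecture and by the SAMPro3D-derived annotations.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 10  # assumed number of semantic classes

class TinyPointSeg(nn.Module):
    """Shared per-point MLP: (B, N, 3) xyz -> (B, N, C) class logits."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, xyz: torch.Tensor) -> torch.Tensor:
        return self.mlp(xyz)

model = TinyPointSeg(NUM_CLASSES)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for step in range(100):                                  # placeholder loop
    xyz = torch.randn(8, 2048, 3)                        # dummy batch of clouds
    labels = torch.randint(0, NUM_CLASSES, (8, 2048))    # dummy per-point labels
    logits = model(xyz)                                   # (B, N, C)
    loss = criterion(logits.reshape(-1, NUM_CLASSES), labels.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```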
• Knowledge of basic computer vision methods
• Some prior experience with AI (PyTorch!)
• Some knowledge of state-of-the-art AI architectures (as taught in “Machine Perception”, for instance)
Not specified
Send your CV and transcript to Lucas Gimeno (gimenol@ethz.ch).