Many grasp synthesis methods detect grasps from discretized scene representations such as point clouds or voxel grids. One example is the Volumetric Grasping Network (VGN) [1]. However, to keep computation tractable, the method operates on a coarse discretization, which can reduce the precision of the predicted grasps.
This project aims to replace the CNN backbone with a continuous implicit neural representation, which promises to capture fine details even from noisy sensing [2]. The goal is to adapt the existing VGN framework accordingly and to compare grasping performance to the state of the art.
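The shift from a discrete voxel grid to a continuous representation can be illustrated with a minimal sketch, roughly in the spirit of Convolutional Occupancy Networks [2]: features stored on a coarse grid are trilinearly interpolated at an arbitrary continuous 3D query point and decoded by a small MLP into a grasp-quality score, instead of reading a value from the nearest voxel. All function names, shapes, and the toy decoder below are illustrative assumptions, not part of the actual VGN code base.

```python
import numpy as np

def interpolate_features(grid, p):
    """Trilinearly sample a feature grid of shape (D, H, W, C) at a
    continuous query point p with coordinates in [0, 1]^3."""
    dims = np.array(grid.shape[:3])
    x = p * (dims - 1)                  # continuous voxel coordinates
    i0 = np.floor(x).astype(int)
    i1 = np.minimum(i0 + 1, dims - 1)
    t = x - i0                          # fractional offset within the cell
    f = np.zeros(grid.shape[3])
    for corner in np.ndindex(2, 2, 2):  # blend the 8 surrounding voxels
        idx = np.where(corner, i1, i0)
        w = np.prod(np.where(corner, t, 1.0 - t))
        f += w * grid[idx[0], idx[1], idx[2]]
    return f

def decode_quality(feature, w1, b1, w2, b2):
    """Toy MLP head mapping an interpolated feature vector to a
    grasp-quality score in (0, 1); stands in for a learned decoder."""
    h = np.maximum(feature @ w1 + b1, 0.0)               # ReLU hidden layer
    return float(1.0 / (1.0 + np.exp(-(h @ w2 + b2))))   # sigmoid output

# Demo: a coarse 4^3 feature grid queried at an arbitrary continuous point.
rng = np.random.default_rng(0)
grid = rng.standard_normal((4, 4, 4, 8))
w1, b1 = rng.standard_normal((8, 16)), np.zeros(16)
w2, b2 = rng.standard_normal(16), 0.0
quality = decode_quality(interpolate_features(grid, np.array([0.2, 0.7, 0.4])),
                         w1, b1, w2, b2)
```

Because the query point is continuous, grasp predictions are no longer tied to voxel centers, which is the property the project aims to exploit.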
During this project, the student will have the opportunity to contribute to an ongoing research code base and to work with a real robotic setup.
[1] Breyer et al. Volumetric Grasping Network: Real-time 6 DOF Grasp Detection in Clutter. CoRL 2020.
[2] Peng et al. Convolutional Occupancy Networks. ECCV 2020.
- Familiarize yourself with the existing VGN code base.
- Review literature on 3D mapping and grasp planning.
- Implement an implicit neural representation backbone.
- Train and evaluate the whole pipeline.
- Conduct experiments in simulation and on a robot.
- Highly motivated and independent student.
- Strong programming skills in Python.
- Knowledge of computer vision and/or deep learning.
- Experience with Linux, ROS, and Git is advantageous.
- Good academic record.
If you are interested in this project, send your transcripts and resume to Michel Breyer (mbreyer@ethz.ch) and Francesco Milano (francesco.milano@mavt.ethz.ch).