Accurate Shape Reconstruction using Tactile Sensors
Integrating tactile feedback into robotic grippers to enhance their capabilities for grasping, pick-and-place tasks, and object identification has shown great potential. TACTFUL, an ETH-internal collaboration between the Robotic Systems Lab, the Integrated Systems Laboratory, and the Micro- and Nanosystems group, aims to advance this field of research. We have developed a robotic skin that can provide precise tactile feedback at high spatial resolution [1]. In this project, we explore how tactile information from robotic grippers can be used to estimate contact geometry.
[1] J. Weichart, M. Ott, T. Burger and C. Hierold, "Towards Artificial Robotic Skin: Highly Sensitive Flexible Tactile Sensing Arrays with 3D Sensing Capabilities," 2022 IEEE 35th International Conference on Micro Electro Mechanical Systems (MEMS), Tokyo, Japan, 2022, pp. 67-70, doi: 10.1109/MEMS51670.2022.9699826.
Keywords: Tactile Sensing, Deep Learning, Feature extraction/representation, Artificial robotic finger
The aim of this project is to predict contact geometry, indentation depths, and/or nodal displacements from tactile sensor data. First, we collect raw capacitance data from the tactile sensors. As part of this process, novel probing routines will be devised for a 5-axis actuation platform to enable the extensive data collection essential for training. Ground-truth data can be augmented with existing 3D finite element (FEM) simulations. In the next phase, the acquired experimental data will be used to build a deep learning pipeline for accurately predicting nodal displacements from sensor readings. This will involve a comprehensive review and analysis of image-based deep learning architectures for shape reconstruction [2], followed by the implementation of a training and evaluation pipeline for the collected data. Finally, we explore how multiple contact patches can be combined for complete object reconstruction.
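To illustrate the kind of image-based mapping the project description has in mind, here is a minimal sketch in PyTorch: a small fully convolutional network that regresses a 3D displacement per taxel from a single capacitance frame. The network name, grid size, and layer widths are hypothetical placeholders, not the project's actual architecture or sensor resolution.

```python
import torch
import torch.nn as nn


class TactileShapeNet(nn.Module):
    """Hypothetical CNN mapping a capacitance frame to per-taxel 3D displacements.

    Input:  (B, 1, H, W) raw capacitance values from the tactile array.
    Output: (B, 3, H, W) predicted (dx, dy, dz) nodal displacements.
    """

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # 1x1 convolution regresses a 3-vector displacement at every taxel.
        self.head = nn.Conv2d(64, 3, kernel_size=1)

    def forward(self, capacitance: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(capacitance))


model = TactileShapeNet()
frame = torch.randn(1, 1, 16, 16)   # one synthetic 16x16 capacitance frame
displacements = model(frame)        # shape: (1, 3, 16, 16)
print(displacements.shape)
```

In a real pipeline, such a network would be trained against FEM-derived ground-truth displacement fields (e.g. with an MSE loss), and the fully convolutional design keeps the prediction spatially aligned with the sensor grid.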