Ultrasonic and Visual Sensor Fusion on Nano-drones
Autonomous nano-sized drones, with a palm-sized form factor, are particularly well-suited for exploration in confined and cluttered environments. A pivotal requirement for exploration is vision-based perception and navigation. However, vision-based systems can fail in challenging conditions such as darkness, extreme brightness, fog, dust, or when facing transparent materials. In contrast, ultrasonic sensors provide reliable collision detection in these scenarios, making them a valuable complementary sensing modality. This project aims to develop a robust deep learning–based navigation system that fuses data from an ultrasonic sensor and a traditional frame-based camera to enhance obstacle avoidance capabilities.
**Prerequisites**
Proficiency in Python and C programming. Background in deep learning. Experience programming microcontrollers.
**Goals**
- Develop a deep neural network (DNN) for obstacle avoidance on nano-drones by fusing traditional camera frames with ultrasonic sensor data (an illustrative sketch follows this list).
- Optimize the algorithm for execution on resource-constrained microcontroller units (MCUs).
- Collect and process real-world and/or simulated data to train and evaluate the model.
- Validate the system through in-field experiments, assessing the obstacle avoidance capabilities of a nano-drone.
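As a rough illustration of the kind of camera–ultrasonic fusion model envisaged above (not a prescribed architecture), the sketch below combines a small convolutional embedding of a grayscale frame with an embedding of a single ultrasonic range reading and predicts a steering command plus a collision probability. Everything here is an assumption for illustration only: the `FusionNet` name, the 96×96 input, the layer sizes, and the two outputs; the actual sensors, network, and quantization strategy for MCU deployment would be defined during the project.

```python
# Hypothetical fusion-network sketch (PyTorch). Input shapes, layer sizes,
# and outputs are illustrative assumptions, not the project's final design.
import torch
import torch.nn as nn


class FusionNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Small convolutional backbone for the camera frame (1 x 96 x 96 assumed).
        self.vision = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> (B, 32)
        )
        # Tiny MLP for the ultrasonic range measurement (one scalar assumed).
        self.ultrasonic = nn.Sequential(nn.Linear(1, 8), nn.ReLU())
        # Fusion head: concatenate both embeddings, predict steering + collision logit.
        self.head = nn.Sequential(
            nn.Linear(32 + 8, 32), nn.ReLU(),
            nn.Linear(32, 2),                               # [steering, collision logit]
        )

    def forward(self, frame, range_m):
        v = self.vision(frame)        # (B, 32) visual embedding
        u = self.ultrasonic(range_m)  # (B, 8) ultrasonic embedding
        out = self.head(torch.cat([v, u], dim=1))
        steering = torch.tanh(out[:, 0])      # normalized steering command in [-1, 1]
        collision = torch.sigmoid(out[:, 1])  # probability of imminent collision
        return steering, collision


if __name__ == "__main__":
    model = FusionNet()
    frame = torch.rand(1, 1, 96, 96)   # dummy grayscale camera frame
    range_m = torch.tensor([[0.8]])    # dummy ultrasonic reading in meters
    print(model(frame, range_m))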
```

A model of this size would still need to be quantized and compiled for the target MCU as part of the optimization goal; that step is outside the scope of this sketch.
Interested candidates should send their CV, transcripts (Bachelor's and Master's), and descriptions of relevant projects to Lorenzo Lamberti [llamberti (at) iis (dot) ee (dot) ethz (dot) ch], Marco Cannici [cannici (at) ifi (dot) uzh (dot) ch], Nico Messikommer [nmessi (at) ifi (dot) uzh (dot) ch], Luca Benini [lbenini (at) iis (dot) ee (dot) ethz (dot) ch], and Davide Scaramuzza [sdavide (at) ifi (dot) uzh (dot) ch].