Event-based vision for drones on Intel's neuromorphic processor Loihi-2
Keywords: Event cameras, neuromorphic processing
Event-based algorithms for motion segmentation have shown state-of-the-art results enabling fast drone maneuvers, e.g., dodging fast-moving obstacles or performing high-speed acrobatics. Neuromorphic processors have demonstrated orders-of-magnitude advantages in energy efficiency and latency across a number of AI workloads, including adaptive control, constrained optimization, and event-based vision tasks. Loihi-2 is Intel's latest neuromorphic research chip, with an advanced feature set that supports a wide range of spiking neural network-based algorithms. We aim to implement neuromorphic versions of event-based vision algorithms on Loihi-2, with the goal of surpassing state-of-the-art performance in energy and/or latency. We use and contribute to the open-source neuromorphic programming framework Lava (https://github.com/lava-nc).
Requirements: Strong experience in Python; knowledge of ROS, computer vision, and robotic simulation environments (e.g., Gazebo, MuJoCo, Unity) is an advantage.
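To make the "spiking neural network" terminology above concrete, here is a minimal leaky integrate-and-fire (LIF) neuron sketch in plain Python/NumPy. This is an illustrative toy of the dynamics such chips accelerate, not Loihi-2 or Lava code; the parameter names (`leak`, `threshold`) are ours.

```python
import numpy as np

def lif_run(current, leak=0.9, threshold=1.0):
    """Simulate one LIF neuron over a list of input currents.

    The membrane voltage v decays by `leak` each step, integrates
    the input, and emits a spike (1) when it crosses `threshold`,
    after which it resets to 0.
    """
    v, spikes = 0.0, []
    for i_t in current:
        v = leak * v + i_t        # leaky integration
        if v >= threshold:        # threshold crossing -> spike
            spikes.append(1)
            v = 0.0               # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A constant sub-threshold input produces periodic spikes once
# enough charge accumulates.
print(lif_run([0.3] * 10))  # -> [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Unlike conventional accelerators, a neuromorphic chip evaluates many such neurons event-driven and in parallel, which is where the energy and latency gains come from.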
In this project, we will investigate a neuromorphic version of the event-based motion segmentation algorithm (https://rpg.ifi.uzh.ch/docs/CVPR18_Gallego.pdf), which is based on contrast maximization over space-time event clouds. This algorithm is a key component of many drone vision and navigation tasks. The project will be carried out in collaboration with Intel Munich and the Institute of Neuroinformatics.
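The core idea of contrast maximization can be sketched in a few lines. The toy below (our simplification, not the full method from the linked paper) assumes a single global 2D velocity: each event is warped back to a reference time along a candidate velocity, the warped events are accumulated into an image, and the candidate is scored by the image's variance (its "contrast"); the velocity that best aligns the events maximizes this score.

```python
import numpy as np

def contrast(events, v, t_ref=0.0, shape=(32, 32)):
    """Score a candidate velocity v = (vx, vy) for an (x, y, t) event cloud."""
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    # Warp each event back to the reference time along v.
    xw = np.round(x - v[0] * (t - t_ref)).astype(int)
    yw = np.round(y - v[1] * (t - t_ref)).astype(int)
    # Keep only events that warp inside the image.
    ok = (xw >= 0) & (xw < shape[1]) & (yw >= 0) & (yw < shape[0])
    img = np.zeros(shape)
    np.add.at(img, (yw[ok], xw[ok]), 1.0)  # image of warped events
    return img.var()  # sharper (better aligned) image -> higher variance

# Synthetic events from an edge moving at 5 px/s along x.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0, 500)
y = rng.integers(0, 32, 500).astype(float)
x = 5.0 * t + 10.0                 # true motion: vx = 5, vy = 0
events = np.stack([x, y, t], axis=1)

# Grid search over candidate velocities; the true one wins.
candidates = [(vx, 0.0) for vx in np.linspace(0.0, 10.0, 21)]
best = max(candidates, key=lambda v: contrast(events, v))
print(best)  # vx near 5 maximizes contrast
```

In the actual algorithm the objective is optimized over richer warp models (and per-cluster motions for segmentation) rather than brute-force search; the neuromorphic version would map this optimization onto spiking dynamics on Loihi-2.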
kiselev at ini.uzh.ch and yulia.sandamirskaya at intel.com
(Send a CV and transcripts of both your bachelor's and master's degrees.)