Event-based cameras, also known as neuromorphic vision sensors, capture visual information as asynchronous pixel-level brightness changes, offering high temporal resolution, low latency, and a wide dynamic range. These characteristics make them ideal for applications that demand rapid response times and efficient data processing. However, deploying deep learning models on resource-constrained devices remains challenging due to their computational overhead and energy consumption. This project explores novel approaches to designing energy-efficient neural networks tailored to event-based vision tasks. By developing models that substantially reduce computational demands and memory footprint while maintaining high performance, we can make real-time processing on embedded hardware feasible. The focus will be on balancing training efficiency and model accuracy while minimizing energy consumption, without sacrificing the quality of results.
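Since standard deep networks expect dense tensors, a typical first step is to convert the asynchronous event stream into a fixed-size representation such as a per-polarity event histogram or a voxel grid. The snippet below is a minimal sketch of such a conversion in Python/NumPy; the (x, y, t, polarity) event layout and the function name are assumptions for illustration, not a prescribed preprocessing pipeline.

import numpy as np

def events_to_histogram(events, height, width):
    # Accumulate an event stream into a two-channel polarity histogram.
    # `events` is assumed to be an (N, 4) array of (x, y, t, polarity)
    # rows with polarity in {0, 1}; the exact layout depends on the
    # dataset loader in use.
    hist = np.zeros((2, height, width), dtype=np.float32)
    x = events[:, 0].astype(np.int64)
    y = events[:, 1].astype(np.int64)
    p = events[:, 3].astype(np.int64)
    np.add.at(hist, (p, y, x), 1.0)  # handles repeated pixel indices correctly
    return hist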
Investigate existing energy-efficient neural network architectures that can be applied to event-based vision. Design and implement energy-efficient neural networks specifically for event-based vision tasks. Explore techniques to optimize model architectures for efficiency without compromising accuracy. Test the developed models on benchmark event-based datasets, such as N-Caltech101, N-CARS, and Neuromorphic ImageNet.
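As one concrete starting point for exploring the efficiency/accuracy trade-off, the sketch below shows a small PyTorch classifier built from depthwise-separable convolutions (a standard lightweight building block, as in MobileNet-style designs) operating on the two-channel histogram above, together with a parameter count as a crude proxy for memory footprint. The layer widths, the TinyEventNet name, and the 101-class output (matching N-Caltech101) are illustrative assumptions, not the architecture to be developed.

import torch
import torch.nn as nn

class SeparableConv(nn.Module):
    # Depthwise-separable convolution: a depthwise 3x3 followed by a
    # pointwise 1x1, which cuts parameters and multiply-accumulates
    # compared to a full 3x3 convolution.
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride, 1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

class TinyEventNet(nn.Module):
    # Small classifier over two-channel event histograms; widths and
    # depth are placeholder choices for illustration only.
    def __init__(self, num_classes=101):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            SeparableConv(16, 32, stride=2),
            SeparableConv(32, 64, stride=2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = TinyEventNet()
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params / 1e3:.1f}k")  # rough proxy for memory footprint

Parameter counts are only a first-order indicator; meaningful energy figures would additionally require counting multiply-accumulate operations or profiling on the target embedded hardware.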
Nikola Zubic (zubic@ifi.uzh.ch), Marco Cannici (cannici@ifi.uzh.ch)