Event-Based Vision at the Edge

Neuromorphic event-based vision sensors are poised to dramatically improve latency, robustness, and power consumption in applications ranging from smart sensing to autonomous driving and assistive technologies for people who are blind.

Soon these sensors may power low-vision aids and retinal implants, where the visual scene must be processed quickly and efficiently before it is displayed. However, novel methods are needed to process these sensors' unconventional output in order to unlock their potential.
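Unlike conventional cameras, event-based sensors emit an asynchronous stream of per-pixel brightness changes, commonly represented as (x, y, timestamp, polarity) tuples. The sketch below is a minimal, hypothetical illustration of accumulating such a stream into an event-count image, one simple way of making this unconventional output usable by downstream vision algorithms; the synthetic events, array sizes, and time window are illustrative assumptions, not tied to any particular camera API.

```python
import numpy as np

# A minimal, hypothetical event stream: each event is (x, y, t, polarity),
# where polarity is +1 (brightness increase) or -1 (decrease).
rng = np.random.default_rng(0)
n_events, width, height = 10_000, 128, 128
events = np.column_stack([
    rng.integers(0, width, n_events),          # x coordinate
    rng.integers(0, height, n_events),         # y coordinate
    np.sort(rng.uniform(0, 0.05, n_events)),   # timestamp in seconds (monotonic)
    rng.choice([-1, 1], n_events),             # polarity
])

def accumulate(events, width, height, t_start, t_end):
    """Sum event polarities per pixel over a time window (a simple 'event frame')."""
    frame = np.zeros((height, width), dtype=np.int32)
    mask = (events[:, 2] >= t_start) & (events[:, 2] < t_end)
    for x, y, _, p in events[mask]:
        frame[int(y), int(x)] += int(p)
    return frame

frame = accumulate(events, width, height, 0.0, 0.01)
print(frame.shape, frame.min(), frame.max())
```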

Project Team

Principal Investigator: Michael Beyeler (Assistant Professor)

Project Funding


Faculty Research Grant: Event-based scene understanding for bionic vision
PI: Michael Beyeler (UCSB)

July 2021 - June 2022
Academic Senate, University of California, Santa Barbara (UCSB)

Publications

We present a way to implement long short-term memory (LSTM) cells on spiking neuromorphic hardware.

We present a spiking neural network (SNN) model that uses spike-latency coding and winner-take-all inhibition to efficiently represent visual objects with as few as 15 spikes per neuron (a toy sketch of this coding scheme follows the publication list).

We present an SNN model that uses spike-latency coding and winner-take-all inhibition to efficiently represent visual stimuli from the Fashion-MNIST dataset.

We present a cortical neural network model for visually guided navigation that has been embodied on a physical robot exploring a real-world environment. The model includes a rate-based motion energy model for area V1 and a spiking neural network model for cortical area MT. The model generates a cortical representation of optic flow, determines the …
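As a rough illustration of the spike-latency coding and winner-take-all scheme mentioned in the SNN publications above: stronger inputs fire earlier, and the earliest spike in a local neighborhood suppresses its neighbors, so a stimulus can be summarized by a handful of early spikes. The NumPy sketch below is a toy approximation under those assumptions, not the published model; the block size, time constant, and random stand-in image are illustrative only.

```python
import numpy as np

def latency_encode(image, t_max=20.0, eps=1e-6):
    """Spike-latency coding: brighter pixels fire earlier (latency falls with intensity)."""
    intensity = image.astype(float) / (image.max() + eps)
    return t_max * (1.0 - intensity)  # latency in ms; the brightest pixel fires at t = 0

def winner_take_all(latencies, block=4):
    """Within each block x block neighborhood, keep only the earliest spike; inhibit the rest."""
    h, w = latencies.shape
    kept = np.full_like(latencies, np.inf)
    for i in range(0, h, block):
        for j in range(0, w, block):
            patch = latencies[i:i + block, j:j + block]
            wi, wj = np.unravel_index(np.argmin(patch), patch.shape)
            kept[i + wi, j + wj] = patch[wi, wj]
    return kept  # finite entries are the surviving (earliest) spikes

image = np.random.default_rng(1).integers(0, 256, (28, 28))  # stand-in for a Fashion-MNIST item
spikes = winner_take_all(latency_encode(image))
print("surviving spikes:", np.isfinite(spikes).sum())
```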
