Event-Based Vision at the Edge

Neuromorphic event-based vision sensors are poised to dramatically improve latency, robustness, and power efficiency in applications ranging from smart sensing to autonomous driving and assistive technologies for people who are blind.

Soon these sensors may power low vision aids and retinal implants, where the visual scene has to be processed quickly and efficiently before it is displayed. However, novel methods are needed to process the unconventional output of these sensors (a sparse, asynchronous stream of per-pixel brightness-change events rather than conventional frames) in order to unlock their potential.
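As a minimal sketch of what this output looks like (assuming a DVS-style event camera; the resolution, event values, and helper function below are illustrative, not tied to any particular sensor), the following Python snippet accumulates a batch of (timestamp, x, y, polarity) events into a signed 2D frame, a common first step before frame-based processing:

```python
import numpy as np

# Hypothetical sensor resolution (e.g., a DVS-style event camera).
WIDTH, HEIGHT = 128, 128

def events_to_frame(events, width=WIDTH, height=HEIGHT):
    """Accumulate (t, x, y, polarity) events into a signed 2D frame.

    `events` is an (N, 4) array where polarity is +1 for a brightness
    increase and -1 for a decrease. Unlike a camera frame, the input
    is a sparse, asynchronous stream: pixels only report changes.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for t, x, y, p in events:
        frame[int(y), int(x)] += int(p)
    return frame

# Example: three synthetic events (timestamps in microseconds).
events = np.array([
    [1000, 10, 20, +1],
    [1005, 10, 20, +1],
    [1010, 64, 64, -1],
])
print(events_to_frame(events)[20, 10])  # -> 2
```

More sophisticated representations (e.g., time surfaces or voxel grids) preserve the fine temporal structure that this simple accumulation discards.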

Project Team

Project Lead:

Open Position

MS/PhD Student or PostDoc

Project Affiliates:

Open Position

Undergraduate Research Assistant (RA)

Open Position

Undergraduate Research Assistant (RA)

Principal Investigator:

Michael Beyeler

Assistant Professor

Project Funding

Faculty Research Grant: Event-based scene understanding for bionic vision
PI: Michael Beyeler (UCSB)

July 2021 - June 2022
Academic Senate
University of California, Santa Barbara (UCSB)

Publications

We present a cortical neural network model for visually guided navigation that has been embodied on a physical robot exploring a real-world environment. The model includes a rate-based motion energy model for area V1 and a spiking neural network model for cortical area MT. The model generates a cortical representation of optic flow, determines the …
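As a rough illustration of the motion energy idea mentioned in this abstract (a minimal sketch in the spirit of the classic Adelson-Bergen energy model, not the paper's actual implementation; all filter parameters below are illustrative assumptions), the snippet computes phase-invariant motion energy from a quadrature pair of spatiotemporal Gabor filters:

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_xt(size=15, sf=0.1, tf=0.1, phase=0.0, sigma=3.0):
    """Spatiotemporal Gabor filter oriented in the x-t plane.

    The ratio of temporal to spatial frequency (tf/sf) sets the
    preferred speed; all parameter values here are illustrative.
    """
    coords = np.arange(size) - size // 2
    X, T = np.meshgrid(coords, coords)
    envelope = np.exp(-(X**2 + T**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * (sf * X + tf * T) + phase)
    return envelope * carrier

def motion_energy(stimulus_xt):
    """Phase-invariant motion energy from a quadrature filter pair."""
    even = convolve2d(stimulus_xt, gabor_xt(phase=0.0), mode="valid")
    odd = convolve2d(stimulus_xt, gabor_xt(phase=np.pi / 2), mode="valid")
    return even**2 + odd**2

# An x-t slice of a drifting sinusoidal grating, one row per time step.
coords = np.arange(64)
X, T = np.meshgrid(coords, coords)
rightward = np.cos(2 * np.pi * (0.1 * X - 0.1 * T))
leftward = np.cos(2 * np.pi * (0.1 * X + 0.1 * T))

# The filter is direction selective, so the two drift directions
# evoke very different total energy.
print(motion_energy(rightward).sum(), motion_energy(leftward).sum())
```

Squaring and summing the even- and odd-phase responses removes the dependence on stimulus phase, leaving a signal that varies with motion direction and speed, which is the property the abstract's V1 stage exploits to feed a cortical representation of optic flow.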
