Research

Fig. 1: Degenerative retinal diseases cause irreversible vision loss in more than 10 million people worldwide. Analogous to cochlear implants, retinal prostheses electrically stimulate surviving retinal cells in order to evoke neuronal responses that are interpreted by the brain as visual percepts (‘phosphenes’).

Towards a Smart Bionic Eye: Artificial Vision for People Who Are Blind

Today, over 10 million people worldwide are living with profound visual impairment, and retinal neuroprostheses (“bionic eye”, Fig. 1) are being developed to restore vision to these individuals. Analogous to cochlear implants, these devices electrically stimulate surviving retinal cells to evoke visual percepts (“phosphenes”). Existing devices generally provide an improved ability to localize high-contrast objects and perform basic orientation & mobility tasks.

However, the quality of current prosthetic vision is still rudimentary. A major outstanding challenge is translating electrode stimulation into a code that the brain can understand. Interactions between the device electronics and the retinal neurophysiology lead to distortions that can severely limit the quality of the generated visual experience.

Rather than aiming to one day restore natural vision (which may remain elusive until we fully understand the neural code of vision), we might be better off thinking about how to create practical and useful artificial vision now. We can already make things appear brighter the closer they get or use computer vision to highlight important objects in the scene. In the future, these visual augmentations could be combined with GPS to give directions, warn users of impending dangers in their immediate surroundings, or even extend the range of visible light with the use of an infrared sensor (think bionic night-time vision). Once the quality of the generated artificial vision reaches a certain threshold, there are a lot of exciting avenues to pursue.

Clinical studies have demonstrated that the vision provided by current sight restoration (SR) devices differs substantially from normal sight.

Rather than predicting perceptual distortions, one needs to solve the inverse problem: What is the best stimulus to generate a desired visual percept?

Novel stimulation strategies can be tested on sighted subjects viewing a simulation of prosthetic vision in virtual/augmented reality.
  • Bionic Vision Lab
    • Brain Sciences: Computational Neuroscience · ML/AI · Visual Neuroscience
    • Vision Sciences: Computer Vision · Psychophysics · Low Vision · Sight Restoration
    • Brain-Computer Interfaces: Neural Engineering · VR/AR/XR · Assistive Technologies

Our group combines expertise across disciplines including computer science, computational neuroscience, and psychology. Joining us requires a particular mindset: no one of us can possibly know everything, but everyone contributes a specific piece to the puzzle (see below for our current openings).

Together we want to do science that matters.

Lab members

Principal Investigator

  • Michael Beyeler, Assistant Professor

PhD Students

  • Justin Kasowski (DYNS)
  • Aiwen Xu (CS)
  • Jacob Granley (CS)
  • Byron Johnson (PBS)

MS Students

  • Ziming Qi (CE)

Undergraduate Students

  • Nathan Wu (CS), Honors Student
  • Anvitha Akkaraju (PBS), Research Assistant
  • Tanya Bhatia (PBS), Research Assistant
  • Annika Brydon (ENV), Research Assistant
  • Rami Dabit (CE), Research Assistant
  • Yuchen Hou (PBS), Research Assistant
  • Ananth Mahes (PBS), Research Assistant
  • Rachel Mochizuki (PBS), Honors Student
  • Iris Moini-Nazeri (CS), Research Assistant
  • Sairisheek Muttukuru (CE), Research Assistant
  • Ryan Neydavood (PBS), Research Assistant
  • Bill Nguyen (PBS, MATH), Research Assistant
  • Ruben Olmos (PSTAT), Research Assistant
  • Fatima Qubadi (PBS), Research Assistant
  • Angel Solares (PBS), Research Assistant
  • Yuval Steinhart (CS), Research Assistant
  • Shuyun Tang (PSTAT), Research Assistant
  • Archita Tharanipathy (MCDB), Research Assistant
  • Kelly Yan (CS), Research Assistant

Alumni

  • Ethan Gao, Visiting Scholar, Ojai Valley School (Summer 2020)
  • Lu Han, Lab Volunteer, CE @ UCSB (Spring 2021)
  • Zuying Hu, MS Student, CS @ UCSB (2020)
  • Dylan Lin, Research Assistant, CS @ UCSB (2020 - 2021)
  • Ori Mizrahi, Research Assistant, CS @ UCSB (2020)
  • Kha Nguyen, Visiting Scholar, UC San Diego (Summer 2020)
  • Rashi Raghulan, Research Assistant, MCDB @ UCSB (2019 - 2020)
  • Versha Rohatgi, Visiting Scholar, Mountain View High (Summer 2020)
  • Yusong Yan, Lab Volunteer, CS @ UCSB (Summer 2020)

Join Us

We are looking for curious and talented individuals who share our passion for bionic vision. If you are interested in joining us, check out our Lab Manual to familiarize yourself with our lab policies.

PhD Students

The application deadline for Fall 2021 has now passed. The next deadline is in December 2021.

Please know that we receive many emails from prospective PhD students. If you decide to contact Michael before applying to the program, you can make your application stand out by demonstrating that you have spent some time on our website and thought carefully about why bionic vision is a good fit for your skills and interests.

CS Master's Students

If you are a CS/ECE Master’s student looking for a project, please contact Michael to set up a time to meet.

We are looking for students interested in applying their methodological skills to research problems in bionic vision. All students should have a solid programming background and strengths in one of the following:

  • Human-computer interaction: Build VR/AR/XR applications for low vision and bionic vision (Unity, compute shaders, image processing, bionic vision simulations, eye tracking).
  • Machine learning/data science: Build predictive models for real-world datasets collected from retinal prosthesis patients (classification, regression, time-series analysis, interpretable models, heterogeneous data).
  • Software engineering/parallel programming: Develop parallelization back ends for pulse2percept, our open-source Python-based simulation framework (Python, Cython, SciPy, OpenMP, GPGPU, JAX); a minimal usage sketch follows this list.
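
To give a flavor of what working with pulse2percept looks like, here is a minimal usage sketch, assuming the package is installed (e.g., pip install pulse2percept) and following the API of recent releases. The choice of implant (ArgusII), phosphene model (AxonMapModel), model parameters, and electrode amplitudes below is illustrative only, not a prescribed workflow; predicting a percept is the kind of computation a faster parallel back end would accelerate.

    # Minimal pulse2percept sketch: simulate the percept elicited by an
    # Argus II epiretinal implant under an illustrative stimulus.
    import pulse2percept as p2p

    # Phosphene model that accounts for axonal streaks
    # (rho and axlambda are spatial decay constants in microns; values here are illustrative)
    model = p2p.models.AxonMapModel(rho=200, axlambda=500)
    model.build()

    # 6 x 10 epiretinal electrode array (Argus II)
    implant = p2p.implants.ArgusII()

    # Activate a few electrodes with arbitrary (illustrative) amplitudes
    implant.stim = {'A1': 10, 'B3': 20, 'C5': 30}

    # Predict the resulting percept and plot it
    percept = model.predict_percept(implant)
    percept.plot()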

Undergraduate Students

All undergraduate positions are currently filled. Please check back in September 2021.

Contact