Topic: Computational Vision

Research Projects

What do visual prosthesis users see, and why? Clinical studies have shown that the vision provided by current devices differs substantially from normal sight.

Embedding simulated prosthetic vision models in immersive virtual reality allows sighted subjects to act as “virtual patients”, experiencing the world as a prosthesis user would see it.
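To illustrate the underlying idea, the sketch below implements a simple scoreboard-style phosphene renderer in NumPy: the input frame is downsampled to an electrode grid, and each electrode's activation is drawn as a Gaussian blob. The grid size, blob width, and test image are illustrative assumptions, not the lab's actual VR pipeline.

```python
import numpy as np

def simulate_phosphenes(image, grid_shape=(6, 10), sigma=4.0):
    """Scoreboard-style simulated prosthetic vision (illustrative sketch).

    Downsamples `image` to an electrode grid and renders each electrode's
    activation as a Gaussian phosphene at its location in the output frame.
    """
    h, w = image.shape
    rows, cols = grid_shape
    out = np.zeros((h, w))
    # Electrode centers, evenly spaced over the field of view
    ys = np.linspace(0, h - 1, rows + 2)[1:-1]
    xs = np.linspace(0, w - 1, cols + 2)[1:-1]
    yy, xx = np.mgrid[0:h, 0:w]
    for i, cy in enumerate(ys):
        for j, cx in enumerate(xs):
            # Activation = mean brightness of the image patch nearest this electrode
            y0, y1 = int(i * h / rows), int((i + 1) * h / rows)
            x0, x1 = int(j * w / cols), int((j + 1) * w / cols)
            amp = image[y0:y1, x0:x1].mean()
            out += amp * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(out, 0, 1)

# Example: a vertical bright bar, "seen" through a 6x10 electrode grid
frame = np.zeros((48, 80))
frame[:, 35:45] = 1.0
percept = simulate_phosphenes(frame)
```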

Understanding the visual system in health and disease is a key issue for neuroscience and neuroengineering applications such as visual prostheses.

How are visual acuity and daily activities affected by visual impairment? Previous studies have shown that a scotoma alters and impairs vision, but the extent to which patient-specific factors affect vision and quality of life is not well understood.

pulse2percept is an open-source Python simulation framework used to predict the perceptual experience of retinal prosthesis patients across a wide range of implant configurations.
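A minimal usage sketch follows, assuming pulse2percept >= 0.6 and its documented axon-map workflow; the electrode choice and parameter values here are illustrative, not values used in any particular study.

```python
import pulse2percept as p2p

# Axon map model (Beyeler et al., 2019): percept shape follows
# retinal nerve fiber bundles rather than a simple dot pattern
model = p2p.models.AxonMapModel(rho=150, axlambda=500)
model.build()

# Argus II epiretinal implant, stimulating a single electrode
# (electrode 'A5' and amplitude 20 are arbitrary example values)
implant = p2p.implants.ArgusII(stim={'A5': 20})

# Predict the resulting percept and visualize it
percept = model.predict_percept(implant)
percept.plot()
```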

How does the brain extract relevant visual features from the rich, dynamic visual input that typifies active exploration, and how does the neural representation of these features support visual navigation?

Researchers Interested in This Topic

Anvitha Akkaraju, Honors Student (PBS)
Nick Arenberg, Research Assistant (CS)
Michael Beyeler, Assistant Professor
Ashley Bruce, MS Student (CS)
Alexander Chau, Research Assistant (CS)
Robert Gee, Research Assistant (CS)
Anand Giduthuri, Research Assistant (CS)
Jacob Granley, PhD Student (CS)
Elaine Ho, Research Assistant (CS)
Yuchen Hou, Project Scientist (PBS)
Rutvik Jha, Research Assistant (CE)
Byron A. Johnson, PhD Student (PBS)
Wayne D. Johnson, BRAIN Scholar (PBS)
Justin Kasowski, PhD Candidate (DYNS)
Anderson Liu, Research Assistant (CS)
Ryan Neydavood, Junior Specialist (PBS)
Lucas Relic, MS Student (CS)
Francie Wei, Research Assistant (CS)