Topic: Computational Vision

Researchers Interested in This Topic

Michael Beyeler, Assistant Professor
Jacob Granley, PhD Candidate (CS)
Lixing (Leo) Guo, Research Assistant (CS)
Yuchen Hou, PhD Student (CS)
Byron A. Johnson, PhD Candidate (PBS)
Sangita Kunapuli, Research Assistant (CS)
Anderson Liu, Student Assistant (CS)
Galen Pogoncheff, PhD Student (CS)
Adyah Rastogi, Research Assistant (CS)
Callie Sardina, MS Student (CS)
Eirini Schoinas, Research Assistant (CCS)
Ivy Wang, DIMAP Student (CS)

Research Projects

What do visual prosthesis users see, and why? Clinical studies have shown that the vision provided by current devices differs substantially from normal sight.

Embedding simulated prosthetic vision models in immersive virtual reality allows sighted subjects to act as "virtual patients," experiencing the world much as a prosthesis user would.

Understanding the visual system in health and disease is a key issue for neuroscience and neuroengineering applications such as visual prostheses.

How are visual acuity and daily activities affected by visual impairment? Previous studies have shown that vision is altered and impaired in the presence of a scotoma, but the extent to which patient-specific factors affect vision and quality of life is not well understood.
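To make the idea concrete, a toy sketch of how a scotoma degrades visual input is shown below: a circular region of an image is masked out with NumPy. The hard-edged circular mask, its radius, and its location are illustrative assumptions only; real scotomas have graded borders and patient-specific shapes and positions.

```python
import numpy as np

def apply_scotoma(image, center, radius):
    """Zero out a circular 'blind spot' in a grayscale image.

    Toy model only: the hard circular edge and fixed location
    are simplifying assumptions, not clinical parameters.
    """
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    out = image.copy()
    out[mask] = 0.0
    return out

# Example: a uniform 100x100 "scene" with a central scotoma
scene = np.ones((100, 100))
degraded = apply_scotoma(scene, center=(50, 50), radius=20)
```

Varying the mask's size and position per subject is one simple way to probe how patient-specific factors might affect acuity and task performance.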

pulse2percept is an open-source Python simulation framework used to predict the perceptual experience of retinal prosthesis patients across a wide range of implant configurations.
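The simplest percept model of this kind is a "scoreboard" model, in which each stimulated electrode elicits an independent Gaussian blob of brightness. The snippet below is a minimal, self-contained NumPy sketch of that idea, not pulse2percept's actual API; the grid size, electrode positions, and spread parameter `rho` are illustrative assumptions (the library's `ScoreboardModel` works in retinal coordinates and supports many implant geometries).

```python
import numpy as np

def scoreboard_percept(electrodes, amps, shape=(80, 80), rho=5.0):
    """Render a scoreboard-style percept: each active electrode
    produces a Gaussian phosphene at its (row, col) location.

    Illustrative sketch only; real models also capture axonal
    streaks, temporal dynamics, and other distortions.
    """
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    percept = np.zeros(shape)
    for (r, c), amp in zip(electrodes, amps):
        percept += amp * np.exp(
            -((yy - r) ** 2 + (xx - c) ** 2) / (2 * rho ** 2)
        )
    return percept

# Three hypothetical electrodes, two of them stimulated
percept = scoreboard_percept(
    [(20, 20), (40, 40), (60, 60)], [1.0, 0.5, 0.0]
)
```

In this sketch, brightness peaks at the most strongly stimulated electrode and falls off with distance, which is why scoreboard-style percepts look like constellations of discrete blobs rather than continuous images.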

How does the brain extract relevant visual features from the rich, dynamic visual input that typifies active exploration, and how does the neural representation of these features support visual navigation?