Computational Vision

What do visual prosthesis users see, and why? Clinical studies have shown that the vision provided by current devices differs substantially from normal sight.

Embedding simulated prosthetic vision models in immersive virtual reality allows sighted subjects to act as "virtual patients", experiencing the world as a prosthesis user would see it.
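A common way to approximate prosthetic vision in software is to downsample the input image to the electrode grid and render each active electrode as a Gaussian phosphene. The sketch below is a minimal NumPy illustration of that idea; the electrode count, phosphene size, and grid layout are arbitrary assumptions, not parameters of any particular device:

```python
import numpy as np

def simulate_prosthetic_vision(image, n_electrodes=6, sigma=4.0):
    """Render a grayscale image as an n_electrodes x n_electrodes
    grid of Gaussian phosphenes (a crude scoreboard-style model)."""
    h, w = image.shape
    # Sample brightness at the center of each electrode's receptive field
    ys = np.linspace(0, h - 1, n_electrodes).astype(int)
    xs = np.linspace(0, w - 1, n_electrodes).astype(int)
    percept = np.zeros_like(image, dtype=float)
    yy, xx = np.mgrid[0:h, 0:w]
    for cy in ys:
        for cx in xs:
            amp = image[cy, cx]
            # Each phosphene is a Gaussian blob scaled by local brightness
            percept += amp * np.exp(
                -((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return percept / percept.max() if percept.max() > 0 else percept

# Example: a bright vertical bar reduces to a column of phosphenes
img = np.zeros((48, 48))
img[:, 16:32] = 1.0
percept = simulate_prosthetic_vision(img)
```

Even this toy model makes the clinical point above concrete: fine detail between electrodes is lost, and the percept is a coarse, blurred version of the scene.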

Understanding the visual system in health and disease is a key issue for neuroscience and neuroengineering applications such as visual prostheses.

How are visual acuity and daily activities affected by visual impairment? Previous studies have shown that vision is impaired in the presence of a scotoma, but the extent to which patient-specific factors affect vision and quality of life remains poorly understood.
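The effect of a scotoma can be prototyped by masking out the corresponding region of the visual field. A minimal sketch, assuming a circular central scotoma whose location and radius are illustrative choices:

```python
import numpy as np

def apply_scotoma(image, center, radius):
    """Zero out a circular region of the visual field to mimic
    the blind region produced by a retinal scotoma."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    out = image.copy()
    out[mask] = 0.0  # no visual information survives inside the scotoma
    return out

# Example: a central scotoma removes the middle of a uniform scene
scene = np.ones((64, 64))
impaired = apply_scotoma(scene, center=(32, 32), radius=10)
lost = 1.0 - impaired.sum() / scene.sum()  # fraction of the field lost
```

Varying the mask's size and position per subject is one way to explore how patient-specific factors might translate into differences in acuity and task performance.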

pulse2percept is an open-source Python simulation framework used to predict the perceptual experience of retinal prosthesis patients across a wide range of implant configurations.

How does the brain extract relevant visual features from the rich, dynamic visual input that typifies active exploration, and how does the neural representation of these features support visual navigation?