Topic: Visual Prosthesis

Researchers Interested in This Topic

Michael Beyeler, Assistant Professor
Jacob Granley, PhD Candidate (CS)
Yuchen Hou, PhD Student (CS)
Byron A. Johnson, PhD Candidate (PBS)
Lucas Nadolskis, PhD Student (DYNS)
Galen Pogoncheff, PhD Student (CS)
Lily M. Turkstra, PhD Student (PBS)
Apurv Varshney, PhD Student (CS)

Research Projects

BionicVisionXR is an open-source virtual reality toolbox for simulated prosthetic vision that uses a psychophysically validated computational model to allow sighted participants to “see through the eyes” of a bionic eye recipient.
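As a rough illustration of the underlying idea (a Python sketch using pulse2percept rather than the BionicVisionXR toolbox itself; the toy scene, the brightness-to-amplitude encoder, and all parameter values below are invented for illustration):

# Illustrative sketch of simulated prosthetic vision (not BionicVisionXR itself):
# encode a toy scene as per-electrode amplitudes, then ask a phosphene model what
# a recipient might perceive. Implant, encoder, and parameters are assumptions.
import numpy as np
from pulse2percept.implants import ArgusII
from pulse2percept.models import AxonMapModel

model = AxonMapModel(rho=200, axlambda=600).build()  # illustrative phosphene params
implant = ArgusII()                                  # 6 x 10 epiretinal grid

# Toy "camera frame": a bright horizontal bar, already downsampled to the grid.
frame = np.zeros((6, 10))
frame[2:4, 2:8] = 1.0

# Naive encoder: pixel brightness -> stimulation amplitude (arbitrary scale).
implant.stim = {name: 30 * amp
                for name, amp in zip(implant.electrodes, frame.ravel())}

percept = model.predict_percept(implant)  # forward model: stimulus -> percept
percept.plot()                            # what the simulated recipient "sees"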

Rather than aiming to one day restore natural vision, we might be better off thinking about how to create practical and useful artificial vision now.

Rather than predicting perceptual distortions, one needs to solve the inverse problem: What is the best stimulus to generate a desired visual percept?
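A minimal sketch of what such an inversion could look like, assuming a simple per-electrode amplitude code and using pulse2percept's scoreboard model as the forward model; the electrode subset, optimizer, and target percept are hypothetical, not the lab's published encoder:

# Hypothetical sketch of stimulus inversion: search electrode amplitudes so that
# the forward model's prediction matches a target percept. The electrode subset,
# optimizer, and parameter values are illustrative.
import numpy as np
from scipy.optimize import minimize
from pulse2percept.implants import ArgusII
from pulse2percept.models import ScoreboardModel

model = ScoreboardModel(rho=150).build()
implant = ArgusII()
electrodes = ["B3", "C5", "D7"]  # small subset of the 60 electrodes, for speed

def forward(amps):
    # One amplitude per chosen electrode; all other electrodes stay off.
    implant.stim = {e: max(float(a), 0.0) for e, a in zip(electrodes, amps)}
    return model.predict_percept(implant).data[..., 0]

target = forward([20.0, 0.0, 20.0])  # hypothetical desired percept

def loss(amps):
    return float(np.mean((forward(amps) - target) ** 2))

res = minimize(loss, x0=[10.0, 10.0, 10.0], method="Nelder-Mead",
               options={"maxiter": 300})
print("Recovered amplitudes:", np.round(res.x, 1))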

What do visual prosthesis users see, and why? Clinical studies have shown that the vision provided by current devices differs substantially from normal sight.

pulse2percept is an open-source Python simulation framework used to predict the perceptual experience of retinal prosthesis patients across a wide range of implant configurations.
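A minimal usage sketch (electrode choices and parameter values are illustrative, and here the same stimulus is run through two of the bundled phosphene models rather than through different implants):

# Minimal pulse2percept sketch: predict the percept elicited by one stimulus under
# two of the bundled phosphene models. Electrode names and values are illustrative.
from pulse2percept.implants import ArgusII
from pulse2percept.models import ScoreboardModel, AxonMapModel

implant = ArgusII()                  # simulated 6 x 10 epiretinal implant
implant.stim = {"A2": 20, "D5": 20}  # amplitudes on two electrodes

for model in (ScoreboardModel(rho=150), AxonMapModel(rho=150, axlambda=500)):
    model.build()
    percept = model.predict_percept(implant)
    percept.plot()                   # circular vs. axon-aligned elongated phosphenes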