Topic: Visual Neuroprostheses

Researchers Interested in This Topic

Michael Beyeler, Assistant Professor
Juliana Chou, Lab Volunteer (PBS)
Isabella Gonzalez, Research Assistant (PBS)
Jacob Granley, PhD Candidate (CS)
Dariya (Dasha) Lobko, Lab Volunteer (PBS)
Lucas Nadolskis, PhD Student (DYNS)
Galen Pogoncheff, PhD Student (CS)
Sukhi Toor, Lab Volunteer (PBS)
Lily M. Turkstra, PhD Student (PBS)
Apurv Varshney, PhD Student (CS)
Aiwen Xu, PhD Candidate (CS)

Research Projects

BionicVisionXR is an open-source virtual reality toolbox for simulated prosthetic vision that uses a psychophysically validated computational model to allow sighted participants to “see through the eyes” of a bionic eye recipient.

Rather than aiming to one day restore natural vision, we might be better off thinking about how to create practical and useful artificial vision now.

What do visual prosthesis users see, and why? Clinical studies have shown that the vision provided by current devices differs substantially from normal sight.

Rather than predicting perceptual distortions, one needs to solve the inverse problem: What is the best stimulus to generate a desired visual percept?
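As a loose illustration of this framing, the sketch below casts stimulus selection as an optimization problem: search for electrode amplitudes whose predicted percept best matches a target. The forward model here is a made-up linear map standing in for a real (nonlinear, psychophysically validated) model; every name and parameter value is hypothetical.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_electrodes, n_pixels = 16, 64

# Hypothetical forward model: percept = A @ amplitudes. Real models of
# prosthetic vision are nonlinear; this stand-in is only for illustration.
A = rng.random((n_pixels, n_electrodes))

def forward(amps):
    return A @ amps

# Synthetic target percept we would like the stimulus to reproduce
target = forward(rng.uniform(0, 20, n_electrodes))

def loss(amps):
    # Squared error between predicted and desired percept
    return np.sum((forward(amps) - target) ** 2)

# Keep amplitudes within (hypothetical) safe stimulation limits
res = minimize(loss, x0=np.zeros(n_electrodes),
               bounds=[(0, 50)] * n_electrodes)
print("success:", res.success, "final loss:", res.fun)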

pulse2percept is an open-source Python simulation framework used to predict the perceptual experience of retinal prosthesis patients across a wide range of implant configurations.
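For a sense of the workflow, here is a minimal sketch assuming a recent pulse2percept release (roughly v0.7 or later); the implant choice, electrode name, and parameter values are arbitrary examples, not recommendations.

import pulse2percept as p2p

# Axon map model of retinal activation (accounts for axonal streaks);
# rho and axlambda values here are arbitrary, in microns
model = p2p.models.AxonMapModel(rho=150, axlambda=500)
model.build()

# Place a 60-electrode Argus II implant and stimulate a single electrode
implant = p2p.implants.ArgusII()
implant.stim = {'A5': 20}  # 20 uA amplitude on electrode A5

# Predict and visualize the resulting percept
percept = model.predict_percept(implant)
percept.plot()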