BionicVisionXR is an open-source virtual reality toolbox for simulated prosthetic vision that uses a psychophysically validated computational model to allow sighted participants to “see through the eyes” of a bionic eye recipient.
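As a rough illustration of what "simulated prosthetic vision" means here, the sketch below downsamples a grayscale frame to a coarse electrode grid and renders each sample as a Gaussian phosphene. It is plain Python/NumPy written only for this example, not code from the BionicVisionXR toolbox itself; the grid size and blur width are arbitrary assumptions.

```python
# Purely illustrative sketch of simulated prosthetic vision (SPV):
# sample a grayscale frame at a coarse electrode grid and render each
# sample as a Gaussian phosphene. Not BionicVisionXR code.
import numpy as np

def simulate_phosphenes(frame, grid=(6, 10), sigma=4.0):
    """Render a 2-D grayscale array as a blurred phosphene pattern."""
    h, w = frame.shape
    rows, cols = grid
    ys = np.linspace(0, h - 1, rows).astype(int)   # electrode rows
    xs = np.linspace(0, w - 1, cols).astype(int)   # electrode columns
    yy, xx = np.mgrid[0:h, 0:w]
    out = np.zeros_like(frame, dtype=float)
    for y in ys:
        for x in xs:
            brightness = frame[y, x]               # sample frame at electrode
            out += brightness * np.exp(-((yy - y) ** 2 + (xx - x) ** 2)
                                       / (2 * sigma ** 2))
    return out / out.max()

frame = np.random.default_rng(0).random((48, 80))  # stand-in for a camera frame
percept = simulate_phosphenes(frame)
print(percept.shape)
```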
Juliana Chou (Lab Volunteer)
Harshita Gangaswamy (Research Assistant)
Lucas Gil Nadolskis (PhD Student)
Isabella Gonzalez (Lab Volunteer)
Jacob Granley (PhD Candidate)
Justin Kasowski (PhD Candidate)
Dariya (Dasha) Lobko (Lab Volunteer)
Jennifer Phung (Lab Volunteer)
Galen Pogoncheff (PhD Student)
Ethan Roma (Research Assistant)
Shivani Sista (Research Assistant)
Sukhi Toor (Lab Volunteer)
Lily M. Turkstra (PhD Student)
Apurv Varshney (PhD Student)
Aiwen Xu (PhD Candidate)
Rather than aiming to one day restore natural vision, we might be better off thinking about how to create practical and useful artificial vision now.
What do visual prosthesis users see, and why? Clinical studies have shown that the vision provided by current devices differs substantially from normal sight.
Rather than predicting the perceptual distortions caused by a given stimulus (the forward problem), one needs to solve the inverse problem: what stimulus best evokes a desired visual percept?
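To make that framing concrete, the toy sketch below assumes a purely hypothetical linear forward model (percept = A @ stim). The real forward model is nonlinear and patient-specific, so this only illustrates how the inverse problem can be posed as a constrained least-squares fit.

```python
# Toy inverse-problem sketch under a *hypothetical* linear forward model:
# each column of A is one electrode's phosphene; find the non-negative
# stimulus whose predicted percept best matches the target.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_pixels, n_electrodes = 400, 60            # 20x20 percept, 60-electrode array
A = rng.random((n_pixels, n_electrodes))    # hypothetical phosphene basis
target = rng.random(n_pixels)               # desired percept (flattened image)

# minimize || A @ stim - target ||^2  subject to stim >= 0
stim, residual = nnls(A, target)
print(stim.shape, residual)
```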
pulse2percept is an open-source Python simulation framework used to predict the perceptual experience of retinal prosthesis patients across a wide range of implant configurations.
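A minimal forward-simulation sketch with pulse2percept might look like the following; the implant choice, model parameters (rho, axlambda), and the stimulated electrode are illustrative values only, and exact defaults may differ across package versions.

```python
# Sketch of a forward simulation with pulse2percept (pip install pulse2percept).
import pulse2percept as p2p

implant = p2p.implants.ArgusII()                  # 6 x 10 epiretinal array
model = p2p.models.AxonMapModel(rho=150, axlambda=500)
model.build()                                     # precompute spatial sensitivity

implant.stim = {'A5': 20}                         # 20 uA on electrode A5 (illustrative)
percept = model.predict_percept(implant)          # simulated percept over space/time
percept.plot()                                    # requires matplotlib
```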