We propose to embed biologically realistic models of simulated prosthetic vision in immersive virtual reality so that sighted subjects can act as 'virtual patients' in real-world tasks.
We combined deep learning-based scene simplification strategies with a psychophysically validated computational model of the retina to generate realistic predictions of simulated prosthetic vision.
We systematically explored the space of possible implant configurations to make recommendations for optimal intraocular positioning of the Argus II implant.
We show that the perceptual experience of retinal implant users can be accurately predicted using a computational model that simulates each individual patient’s retinal ganglion axon pathways.
To investigate the effect of axonal stimulation on the retinal response, we developed a computational model of a small population of morphologically and biophysically detailed retinal ganglion cells, and simulated their response to epiretinal electrical stimulation. We found that activation thresholds of ganglion cell somas and axons varied systematically with both stimulus pulse duration and electrode-retina distance. These findings have important implications for the improvement of stimulus encoding methods for epiretinal prostheses.
The goal of this review is to summarize the vast basic science literature on developmental and adult cortical plasticity with an emphasis on how this literature might relate to the field of prosthetic vision.
*pulse2percept* is an open-source Python simulation framework that predicts the perceptual experience of retinal prosthesis patients across a wide range of implant configurations.