BionicVisionXR featured at Unite 2024

Our real-time VR simulation of bionic eye technology, built on AI-driven models for visual stimulus encoding, was highlighted in Unity's Sentis presentation.

Our latest research on simulating prosthetic vision was highlighted at Unite 2024 during a presentation on Unity Sentis, Unity's neural inference engine. The presentation showcased how Unity Sentis enables real-time execution of computationally expensive AI models within the Unity runtime.

Our project focuses on using neurophysiologically inspired and psychophysically validated models to simulate the visual experiences that could be generated by future bionic eye implants. These models are integrated into immersive virtual reality (VR) environments, updating in real time based on user head and eye movements. By leveraging Unity Sentis, we can run these models efficiently, allowing us to create realistic simulations of what individuals with prosthetic vision may experience.
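To give a flavor of what this looks like in practice, here is a minimal C# sketch of per-frame inference using the Sentis 1.x API. The model asset, texture hookup, and class name are illustrative assumptions for this post, not our actual pipeline:

```csharp
using UnityEngine;
using Unity.Sentis;

// Minimal sketch (Sentis 1.x API): run a stimulus-encoding model on each
// camera frame. Model asset, tensor shapes, and texture wiring are
// illustrative placeholders, not the lab's actual implementation.
public class ProstheticVisionSimulator : MonoBehaviour
{
    public ModelAsset modelAsset;        // ONNX model imported into Unity
    public RenderTexture cameraFeed;     // frame from the headset's viewpoint
    public RenderTexture simulatedView;  // simulated prosthetic percept

    IWorker worker;

    void Start()
    {
        var model = ModelLoader.Load(modelAsset);
        // GPU compute backend keeps heavy inference off the CPU frame budget.
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);
    }

    void Update()
    {
        // Convert the current frame to a tensor and run the encoder.
        using TensorFloat input = TextureConverter.ToTensor(cameraFeed);
        worker.Execute(input);

        // Render the model's output back to a texture for display in VR.
        var output = worker.PeekOutput() as TensorFloat;
        TextureConverter.RenderToTexture(output, simulatedView);
    }

    void OnDestroy() => worker?.Dispose();
}
```

Because both the tensor conversion and the inference run on the GPU, the simulated percept can be updated every frame as the user's head and eyes move.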

Big thanks to Unity’s Bill Cullen and Alexandre Ribard for making this happen!

Michael Beyeler
Assistant Professor

Michael Beyeler directs the Bionic Vision Lab at UC Santa Barbara, which is developing novel methods and algorithms to interface sight recovery technologies with the human visual system, with the ultimate goal of restoring useful vision to the blind.
