Our latest research on simulating prosthetic vision was highlighted at Unite 2024 during a presentation on Unity Sentis, Unity’s AI neural engine. The presentation showcased how Unity Sentis enables real-time execution of computationally expensive AI models within the Unity Runtime.
Our project focuses on using neurophysiologically inspired and psychophysically validated models to simulate the visual experiences that future bionic eye implants could generate. These models are integrated into immersive virtual reality (VR) environments and update in real time based on the user’s head and eye movements. By running them through Unity Sentis, we can execute these models efficiently enough to create realistic simulations of what individuals with prosthetic vision may experience.
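To give a sense of what running such a model per frame looks like, here is a minimal sketch written against the Unity Sentis 1.x API (ModelLoader, WorkerFactory, TensorFloat). The component name, model asset, and texture field are hypothetical placeholders for illustration, not our actual simulation code.

```csharp
using Unity.Sentis;
using UnityEngine;

// Hypothetical sketch: schedule an imported ONNX model once per frame on a camera texture.
// Names (PhospheneSimulator, modelAsset, sourceTexture) are placeholders, not the project's code.
public class PhospheneSimulator : MonoBehaviour
{
    public ModelAsset modelAsset;        // ONNX model imported as a Sentis asset
    public RenderTexture sourceTexture;  // per-frame view rendered from the headset camera

    Model runtimeModel;
    IWorker worker;

    void Start()
    {
        runtimeModel = ModelLoader.Load(modelAsset);
        // GPUCompute keeps inference off the CPU so the VR frame rate stays high.
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, runtimeModel);
    }

    void Update()
    {
        // Convert the current camera frame to a tensor and run the model on it.
        using TensorFloat input = TextureConverter.ToTensor(sourceTexture);
        worker.Execute(input);
        TensorFloat output = worker.PeekOutput() as TensorFloat;
        // The output would then be rendered back to the headset display.
    }

    void OnDestroy()
    {
        worker?.Dispose();
    }
}
```

In a head- and eye-tracked VR setup, the source texture would reflect the current gaze each frame, so the simulated prosthetic view stays in sync with the user's movements.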
Big thanks to Unity’s Bill Cullen and Alexandre Ribard for making this happen!