Lucas Gil Nadolskis is currently a Graduate Student Researcher in the Bionic Vision Lab.
Lucas earned his BS in Computer Science with a minor in Neuroscience from the University of Minnesota in 2021, where he conducted research on autonomous navigation, computer vision, and brain-computer interfaces. He then completed his MS in Biomedical Engineering at Carnegie Mellon University, where his primary work focused on analyzing top-down pathways of the visual system and how they could be integrated into cortical implants for the blind. He also worked with the Human-Computer Interaction department on accessibility in data visualization, an area he hopes to continue exploring.
Starting Fall ’23, Lucas will be a PhD student in the Interdepartmental Graduate Program in Dynamical Neuroscience (DYNS) at UC Santa Barbara, where he will investigate novel approaches to cortical implants for the blind. Blind himself since the age of five, Lucas has broad interests ranging from neuroscience to accessibility, all of which can be summarized as efforts to improve the lives of blind people around the world.
Outside the lab, he spends most of his free time on music, traveling, and searching for audio-described content.
MS in Computational Biomedical Engineering, 2023
Carnegie Mellon University, Pittsburgh, PA
BS in Computer Science, 2021
University of Minnesota–Twin Cities, Minneapolis, MN
A nuanced understanding of the strategies that people who are blind or visually impaired employ to perform different instrumental activities of daily living (iADLs) is essential to the success of future visual accessibility aids.
This research explores the integration of computer vision into various assistive devices, aiming to enhance urban navigation and environmental interaction for individuals who are blind or visually impaired.
We introduce VisionAI, a mobile application designed to enhance the in-store shopping experience for individuals with vision impairments.
Anika Arora, Lucas Nadolskis, Michael Beyeler, Misha Sra. 2024 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct).
Our interview study found a significant gap between researcher expectations and implantee experiences with visual prostheses, underscoring the importance of focusing future research on usability and real-world application.
Lucas Nadolskis, Lily M. Turkstra, Ebenezer Larnyo, Michael Beyeler. Translational Vision Science & Technology (TVST).
(Note: LN and LMT contributed equally to this work.)
We present a series of analyses of the shared representations between evoked neural activity in the primary visual cortex of a blind human with an intracortical visual prosthesis and latent visual representations computed in deep neural networks.
Jacob Granley, Galen Pogoncheff, Alfonso Rodil, Leili Soo, Lily M. Turkstra, Lucas Nadolskis, Arantxa Alfaro Saez, Cristina Soto Sanchez, Eduardo Fernandez Jover, Michael Beyeler. Workshop on Representational Alignment (Re-Align), ICLR ’24.
(Note: JG and GP contributed equally to this work.)