AI sees what you see: Researchers use eye reflections to reconstruct 3D environments

Researchers at the University of Maryland have made strides in transforming eye reflections into 3D scenes, albeit with limitations. Building on Neural Radiance Fields (NeRF), an AI technique that reconstructs environments from 2D photos, the team explored whether the subtle light reflections in a person's eyes carry enough information to recover their surroundings. While the eye-reflection approach is far from practical application, the study offers an intriguing glimpse of a technology that could one day reconstruct environments from simple portrait photos.
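At NeRF's core is a volume-rendering step: color samples along each camera ray are blended according to their density, so that opaque matter occludes what lies behind it. The sketch below is an illustrative NumPy implementation of that standard compositing equation, not code from the study; the function name and array shapes are assumptions for the example.

```python
import numpy as np

def render_ray(densities, colors, deltas):
    """Composite color samples along one camera ray using NeRF-style
    volume rendering (illustrative sketch, not the authors' code).

    densities: (N,) non-negative volume density sigma_i at each sample
    colors:    (N, 3) RGB color at each sample
    deltas:    (N,) spacing between adjacent samples along the ray
    """
    # Opacity contributed by each ray segment: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance T_i: fraction of light surviving all earlier segments
    trans = np.cumprod(np.concatenate(([1.0], 1.0 - alphas)))[:-1]
    # Each sample's weight is "how visible it is" times "how opaque it is"
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)
```

Training a NeRF amounts to adjusting the densities and colors (predicted by a neural network) until rays rendered this way match the input photos.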

To achieve this, the researchers used consecutive images captured by a single sensor, focusing on the light reflected in the subject's eyes. By analyzing high-resolution photos of a person looking toward a fixed camera position, they isolated the eye reflections and calculated where the eyes were directed in each image.
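The reason eye reflections encode the surroundings is that the cornea acts as a small curved mirror: each pixel of the reflection corresponds to a scene ray obtained by reflecting the camera's viewing ray about the local surface normal. The snippet below is a minimal sketch of that standard specular-reflection geometry, assuming a unit normal; it illustrates the underlying optics rather than reproducing the study's pipeline.

```python
import numpy as np

def mirror_ray(view_dir, normal):
    """Reflect an incoming camera ray off a mirror-like corneal surface.

    Uses the standard reflection formula r = d - 2 (d . n) n, where d is
    the incoming ray direction and n the unit surface normal at the point
    where the ray strikes the cornea. The returned direction points toward
    the part of the scene visible in that pixel of the eye reflection.
    """
    d = np.asarray(view_dir, dtype=float)
    n = np.asarray(normal, dtype=float)
    d = d / np.linalg.norm(d)   # normalize incoming direction
    n = n / np.linalg.norm(n)   # normalize surface normal
    return d - 2.0 * np.dot(d, n) * n
```

Tracing one such ray per reflection pixel, across several photos with slightly different eye positions, yields the multi-view ray bundle that a NeRF-style reconstruction needs.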