AI sees what you see: Researchers use eye reflections to reconstruct 3D environments

In a controlled setting, the results revealed a discernible reconstruction of the environment from human eye reflections. The team also experimented with a synthetic eye, which produced an even more impressive, dreamlike scene. However, their attempt to model eye reflections from music videos featuring Miley Cyrus and Lady Gaga yielded only vague blobs, indicating that the technology is still far from real-world use.

The researchers faced substantial challenges in reconstructing even crude and blurry scenes. The cornea, for instance, introduces inherent noise that makes it difficult to separate the reflected light from the complex textures of the iris. To address this, they implemented cornea pose optimization, which estimates the position and orientation of the cornea, and iris texture decomposition, which extracts the unique features of each individual iris during training. They also applied a radial texture regularization loss, a machine learning technique that encourages the learned iris texture to be smoother than the original material, further isolating the reflected scenery.
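The idea behind a radial texture regularization loss can be sketched in a few lines: if the iris texture is sampled on a polar (radius × angle) grid, penalizing variation along the angular direction pushes the model toward smooth, radius-dependent textures, leaving sharper detail to be explained by the reflected scene instead. The polar-grid parameterization, function name, and grid sizes below are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def radial_texture_regularization(texture: np.ndarray) -> float:
    """Penalize angular variation of an iris texture on a polar grid.

    texture: array of shape (n_radii, n_angles) -- an assumed, simplified
    parameterization. A texture that varies only with radius (rows constant
    along each angle) incurs zero loss; angular detail is penalized.
    """
    # Difference between each angular sample and its neighbor,
    # wrapping around at 2*pi via np.roll.
    angular_diff = texture - np.roll(texture, shift=1, axis=1)
    return float(np.mean(angular_diff ** 2))

# A purely radial texture (each row constant across angles) has zero loss;
# adding angular noise raises it.
radial_only = np.tile(np.linspace(0.0, 1.0, 32)[:, None], (1, 64))
noisy = radial_only + 0.1 * np.random.default_rng(0).standard_normal((32, 64))
print(radial_texture_regularization(radial_only))  # 0.0
print(radial_texture_regularization(noisy))        # > 0
```

In training, a term like this would be added to the main reconstruction loss with a weighting coefficient, so the optimizer trades off iris smoothness against fitting the observed reflections.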