
Mountain splendor? Scientists know where your eyes will look

Date:
December 4, 2018
Source:
Yale University
Summary:
Using precise brain measurements, researchers predicted how people's eyes move when viewing natural scenes, an advance in understanding the human visual system that can improve a host of artificial intelligence efforts, such as the development of driverless cars.
FULL STORY

Using precise brain measurements, Yale researchers predicted how people's eyes move when viewing natural scenes, an advance in understanding the human visual system that could improve a host of artificial intelligence efforts, such as the development of driverless cars.

"We are visual beings and knowing how the brain rapidly computes where to look is fundamentally important," said Yale's Marvin Chun, Richard M. Colgate Professor of Psychology, professor of neuroscience and co-author of new research published Dec. 4 in the journal Nature Communications.

Eye movements have been extensively studied, and researchers can predict with some certainty where people will direct their gaze among the different elements of a scene. What hasn't been understood is how the brain orchestrates this ability, which is so fundamental to survival.

In a previous example of "mind reading," Chun's group successfully reconstructed facial images viewed while people were scanned in an MRI machine, based on their brain imaging data alone.

In the new paper, Chun and lead author Thomas P. O'Connell took a similar approach and showed that by analyzing the brain responses to complex, natural scenes, they could predict where people would direct their attention and gaze. This was made possible by analyzing the brain data with deep convolutional neural networks -- models that are extensively used in artificial intelligence (AI).
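
The release does not include the analysis code, but the general recipe it describes, mapping each scene's fMRI responses onto a priority (saliency) map and checking that map against where viewers actually look, can be sketched with a simple regression. The Python fragment below is only an illustration under stated assumptions: the data are synthetic, the 16x16 saliency grid, voxel count, and ridge penalty are arbitrary choices, and the published study decoded features of deep convolutional neural networks rather than the random numbers used here.

```python
import numpy as np

# Illustrative sketch (not the authors' pipeline): decode a per-image saliency
# map from simulated fMRI voxel responses with ridge regression, then score the
# prediction against a held-out "observed" map. All data below are synthetic.

rng = np.random.default_rng(0)

n_train, n_test = 200, 50          # number of natural-scene images
n_voxels = 500                     # fMRI voxels in a visual region of interest
map_size = 16 * 16                 # flattened 16x16 saliency map

# Hypothetical linear relation between voxel responses and saliency maps
W_true = rng.normal(size=(n_voxels, map_size))
X_train = rng.normal(size=(n_train, n_voxels))
X_test = rng.normal(size=(n_test, n_voxels))
Y_train = X_train @ W_true + rng.normal(scale=5.0, size=(n_train, map_size))
Y_test = X_test @ W_true + rng.normal(scale=5.0, size=(n_test, map_size))

# Ridge regression: W = (X'X + lambda*I)^-1 X'Y
lam = 10.0
XtX = X_train.T @ X_train + lam * np.eye(n_voxels)
W_hat = np.linalg.solve(XtX, X_train.T @ Y_train)

# Predict saliency maps for held-out scenes and score each with Pearson r
Y_pred = X_test @ W_hat
scores = [np.corrcoef(Y_pred[i], Y_test[i])[0, 1] for i in range(n_test)]
print(f"mean prediction correlation on held-out scenes: {np.mean(scores):.3f}")
```

In the actual study, the predicted maps would instead be compared with fixation density maps measured by eye tracking in independent viewers.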

"The work represents a perfect marriage of neuroscience and data science," Chun said.

The findings have myriad potential applications, such as testing competing artificial intelligence systems that categorize images and guide driverless cars.

"People can see better than AI systems can," Chun said. "Understanding how the brain performs its complex calculations is an ultimate goal of neuroscience and benefits AI efforts."

 


Story Source:

Materials provided by Yale University. Original written by Bill Hathaway. Note: Content may be edited for style and length.


Journal Reference:

  1. Thomas P. O’Connell, Marvin M. Chun. Predicting eye movement patterns from fMRI responses to natural scenes. Nature Communications, 2018; 9 (1) DOI: 10.1038/s41467-018-07471-9

