
Bifocals in the brain

Date:
June 10, 2016
Source:
Universitaet Tübingen
Summary:
Seeing -- arguably our most important way of perceiving the world -- mostly happens without conscious intent. We see much better in the center of our visual field (along the visual axis) than in the periphery. So when our brain detects an object of interest in the periphery of our visual field, it immediately initiates an eye movement so that our visual axis intersects with that object. Once an object is in our direct line of sight, we can perceive it in far more depth and detail. Now researchers report that visual information from near and far space is processed with differing degrees of acuity.

Neuroscientists from Tübingen have discovered how our brain processes visual stimuli above and below the horizon differently. The researchers, led by Dr. Ziad Hafed of the Werner Reichardt Centre for Integrative Neuroscience (CIN) at the University of Tübingen, studied non-human primates and found that different parts of the visual field are represented asymmetrically in the superior colliculus, a brain structure central to visual perception and behavior. More neural tissue is assigned to the upper visual field than to the lower. As a result, visual stimuli above the horizon are processed more sharply, more strongly, and faster: our brain is wearing bifocals, so to speak.

Seeing -- arguably our most important way of perceiving the world -- mostly happens without conscious intent. We see much better in the center of our visual field (along the visual axis) than in the periphery. So when our brain detects an object of interest in the periphery of our visual field, it immediately initiates an eye movement so that our visual axis intersects with that object. Once an object is in our direct line of sight, we can perceive it in far more depth and detail.

This is partly due to the much greater density of photoreceptor cells in a very small area in the center of the retina -- the fovea. But the preference of visual perception for the center of our visual field is also represented in the brain: it is mirrored in those brain structures that process stimuli transmitted from the fovea. For instance, within the superior colliculus (SC) -- a midbrain area that initiates eye movements toward peripheral stimuli based directly on input from the eyes -- much more neural tissue is dedicated to processing foveal signals than to processing peripheral signals. This phenomenon is called foveal magnification.

Now Dr. Hafed's team has shown that besides the fovea, other parts of the visual field are also 'magnified' in the SC. Their findings reveal that the currently accepted model of the SC, which accounts only for foveal magnification, is not sufficient. That simple model in effect assumes that our SC looks at the world through a magnifying lens: the closer an object is to the center of our visual field, the more distinctly it is picked up by specific neurons, and the more such neurons are dedicated to processing it.

Dr. Hafed's new model refines this picture, adding upper visual field magnification on top of foveal magnification. His team has found that the upper half of the visual field is represented in the SC by receptive fields that are much smaller, more finely tuned to the spatial structure of incoming images, and more sensitive to image contrast. The lower visual field, on the other hand, is represented at a lower resolution. Hafed's team therefore thinks of the 'lens' in the SC more as a pair of bifocal glasses.
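To make the bifocal picture concrete, here is a minimal toy sketch in Python of how such an asymmetric magnification might be expressed: receptive-field size grows with distance from the visual axis (eccentricity), and an extra scaling factor shrinks receptive fields in the upper visual field. The function and all constants are hypothetical illustration values, not quantities measured by Hafed's team.

```python
# Toy sketch of the "bifocal" idea: receptive-field (RF) size grows with
# eccentricity (foveal magnification), and an assumed extra factor makes RFs
# smaller -- i.e. resolution higher -- in the upper visual field.
# All constants are hypothetical illustration values, not measurements
# from Hafed & Chen (2016).

def rf_size_deg(eccentricity_deg: float, upper_field: bool) -> float:
    """Hypothetical RF diameter (in degrees) at a given eccentricity."""
    base = 0.5 + 0.2 * eccentricity_deg      # RFs grow away from the fovea
    scale = 0.7 if upper_field else 1.0      # assumed upper-field magnification
    return base * scale

if __name__ == "__main__":
    for ecc in (1, 5, 10):
        upper = rf_size_deg(ecc, upper_field=True)
        lower = rf_size_deg(ecc, upper_field=False)
        print(f"{ecc:>2} deg eccentricity: upper {upper:.2f} deg, lower {lower:.2f} deg")
```

Under these assumed numbers, a stimulus at any given eccentricity above the horizon would be covered by smaller, more densely packed receptive fields than the same stimulus below it -- the qualitative pattern the study describes.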

To Dr. Hafed, the asymmetry in neural representation is adapted to our everyday environment. Far objects project smaller images on our retina than near objects. Therefore, processing images of near objects in a way that lets us react to them quickly and usefully requires less resolution than processing images of objects that are far away. 'In our three-dimensional environments, objects in the lower half of our visual field are usually close, a part of near space. An example would be the instruments in a car as we drive, which are low and close to us,' Hafed explains. 'Meanwhile, objects in far space, such as an upcoming intersection, are viewed in the upper half of our visual field. In order to be able to focus precisely on objects that are far away, we intuitively need higher resolution in the upper visual field. Our experiments provide substantial evidence that the old model, with its symmetrical representation of upper and lower visual fields in the SC, needs to be rethought.'

The findings of Hafed's team may greatly benefit user-interface design in augmented reality (AR) and virtual reality (VR) systems. Such systems feature large, immersive displays covering almost the entire visual field, giving designers tremendous freedom in where to place essential user feedback; optimizing that placement around the 'human factor' currently represents a significant engineering challenge. The insight that the SC 'bifocals' directly translate into faster and more accurate eye movements toward the upper visual field could guide the strategic placement of feedback that requires rapid orienting.


Story Source:

Materials provided by Universitaet Tübingen. Note: Content may be edited for style and length.


Journal Reference:

  1. Ziad M. Hafed, Chih-Yang Chen. Sharper, Stronger, Faster Upper Visual Field Representation in Primate Superior Colliculus. Current Biology, 2016; DOI: 10.1016/j.cub.2016.04.059

