Laser Measurements Reveal Biological Basis Of Distance Perception
- Date: May 21, 2003
- Source: Duke University
- Summary: Using a laser range-finder, neurobiologists have scanned real-life scenes to gather millions of distance measurements to surfaces in each scene -- analyzing the mass of data to explain a series of long-known but little-understood quirks in how people judge distances.
The measurements reveal, for example, that the tendency of people to estimate the distance of isolated objects as being six to 12 feet away arises because that is the average distance of actual objects and surfaces in the visual scenes people encounter.
Thus, said the Duke University Medical Center neurobiologists, the findings support their theory that the visual system has evolved to make the best statistical guess about distances and other features of visual scenes, based on past experience.
The researchers, Zhiyong Yang and Dale Purves, published their findings in the June 2003 issue of "Nature Neuroscience" (online version posted May 18, 2003). Their research was supported by the National Institutes of Health and the Geller Endowment. Yang is a postdoctoral fellow and Purves is the George B. Geller Professor for Research in Neurobiology.
"All the characteristics of the visual world that we take for granted -- for example the diminution of size with distance -- are a result of perception," said Purves. "So, a question for centuries has been, 'What is the biological reason we see space in the peculiar way that we do?'"
In their past research on the perception of geometry, color, brightness and motion, Purves and his colleagues have accumulated evidence that visual processing is not the result of logical calculations performed on the image that falls on the retina. Rather, they have shown that vision is a fundamentally empirical phenomenon, in which the connections between nerve cells in the visual system evolved through the success of organisms that correctly interpreted an inherently ambiguous visual world. The ambiguity arises because the photons striking the retina carry no inherent information about their origins; the visual system must therefore process retinal information statistically to correctly interpret a visual world that cannot be known directly.
In the "Nature Neuroscience" paper, Yang and Purves described how they used a laser range-finder -- which scans a scene and automatically records data about the distance to each of millions of point in a scene -- to scan about a hundred real scenes in the Duke Forest, on the Duke campus and inside campus buildings. These distance data were fed into a computer to analyze the distance to the multitude of points in the scenes.
Their purpose was to discover the natural basis for a number of inherent tendencies people exhibit when asked to judge distances. These tendencies include the "specific distance tendency," in which people shown an isolated object in a dark room with no other distance context judge that object to be six to 12 feet away.
Another tendency, called the "equidistance tendency," is for people shown two objects side by side in such circumstances to invariably judge them to be the same distance away. Other tendencies explored by Yang and Purves include the influence of eye level, angle of sight and intervening ground-surface structure, such as dips or humps, on distance estimates.
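One can sketch how the equidistance tendency could fall out of such measurements; the snippet below again uses simulated stand-in data rather than the authors' scans. The point is that for pairs of neighboring points in a range image, the difference in distance is sharply peaked near zero, so "same distance" is the statistically safest guess:

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Simulated range image (meters); smoothing crudely mimics the spatial
# continuity of real surfaces (walls, ground, foliage), which is what
# makes neighboring points tend to lie at similar distances.
rng = np.random.default_rng(1)
range_image = uniform_filter(
    rng.gamma(shape=3.0, scale=1.0, size=(500, 500)), size=9
)

# Differences in distance between horizontally adjacent sample points.
diffs = np.diff(range_image, axis=1).ravel()
print(f"mean |difference| between neighbors: {np.abs(diffs).mean():.3f} m")
# This distribution peaks sharply at zero: absent other cues, "same
# distance as its neighbor" is the statistically safest guess about a
# second object -- the equidistance tendency.
```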
"Researchers have known about these odd aspects of estimated distance for decades, but haven't found a good explanation for them," said Purves. "For the most part, people have just taken it for granted that these are the odd ways that we see things." Some researchers had used mathematical or geometrical theories to attempt to explain one or the other of these phenomena, but such approaches have failed, said Purves. The analysis by Yang of the massive amount of distance data on real world visual scenes, however, revealed that all of these tendencies reflect the statistics of the way objects are, on average, related to the observer.
"This evidence shows, for example, that our tendency to judge the distance of objects as being two to four meters away arises because that's really the average distance of surfaces when you use a laser to measure the actual distances of everything in a scene," said Purves. Similarly, he said, the statistical analyses of the scenes match the known influence on distance estimation of such parameters as the distance away of two objects in a scene, of the distance of objects at eye-level, at various angles of declination, and of objects separated from the observer by intervening ground structures.
"What we found so extraordinary is that the rationale for this whole set of basic distance judgment phenomena simply falls out of the measured distances that typically exist in the environment," said Purves. "All these peculiar perceptions reflect the probabilistic structure of the environment with respect to the distance from us of the points on object surfaces.
"Thus, for the first time, we're providing a biological basis to explain these phenomena," he said. "You see distances in this way because reflected light from points in space that project onto your retina is completely ambiguous with respect to how far away that point really is. The observer nonetheless has to solve the problem of what's out there. The visual system has evidently evolved to use the statistics of past experience to 'understand' what those distances are most likely to be, and that is what you see."
Story Source:
Materials provided by Duke University. Note: Content may be edited for style and length.