
Visual Research Seeks To Cut Through Clutter; Rutgers Work May Benefit Baggage Screening

Date:
January 18, 2006
Source:
Rutgers, the State University of New Jersey
Summary:
Think you have problems finding your car keys in a messy drawer? Just imagine the difficulty faced by airport baggage checkers searching for weapons amid a jumbled clutter of different textures and colors. A researcher at Rutgers University--Camden has done just that, and offers insight on how to cut through the visual clutter to more effectively identify distinct items, such as hidden weapons.

Think you have problems finding your car keys in a messy drawer? Just imagine the difficulty faced by airport baggage checkers searching for weapons amid a jumbled clutter of different textures and colors.

A researcher at Rutgers University--Camden has done just that, and offers insight on how to cut through the visual clutter to more effectively identify distinct items, such as hidden weapons.

"Weapons can come in any form," says Mary Bravo, an assistant professor of psychology at Rutgers University-Camden. "Baggage checkers don't know exactly what they're looking for."

Funded by the National Science Foundation and published in the journals Perception and Vision Research, Bravo's research seeks to clarify how people see, interpret, and identify objects.

While many previous vision studies considered how people recognize objects in unrealistically simple settings, such as a single object on a plain background, Bravo's work acknowledges that real life is rarely uncluttered. For example, very few people would pack a gun in an empty suitcase. So the Rutgers-Camden researcher opted to test how human vision finds objects set in clutter.

In one study, Bravo asked participants to find food amidst a clutter of objects. No specific type of food was identified, increasing the difficulty of the search.

The food was sought in two settings: Each had the same number of objects, but in one setting the objects were packed more densely than in the other. Bravo found that the density of the arrangement didn't affect the search if the objects were simple, such as a comb or sock. If the objects were more complex, such as a set of keys, then increasing the density of the arrangement increased the difficulty of the search task.

"When the objects were presented in a sparse array, search times to find the target were similar for displays composed of simple and complex objects. But when the same objects were presented as dense clutter, search functions were steeper for displays composed of complex objects," Bravo says.

The study also gave insight on what we look for when we search. "While search rates in a sparse display are determined by the number of objects, search rates in clutter are also affected by the number of object parts," Bravo says.
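
To make the shape of that finding concrete, here is a minimal sketch of the standard linear search-function idea the quotes describe. The function name, the constants, and the simplifying assumption that search in sparse displays scales with the number of objects while search in clutter scales with the number of object parts are illustrative stand-ins, not figures from Bravo's studies.

    # Illustrative sketch (not Bravo's actual model): response time grows
    # linearly with the number of items that must be processed. Assumption:
    # in sparse displays that count is the number of objects; in dense
    # clutter it is closer to the number of object parts. All constants
    # below are hypothetical.

    def predicted_search_time(n_objects, parts_per_object, cluttered,
                              baseline=0.4, secs_per_item=0.05):
        """Return a notional search time in seconds."""
        items = n_objects * parts_per_object if cluttered else n_objects
        return baseline + secs_per_item * items

    # Same 12 objects, simple (1 part) versus complex (4 parts),
    # in a sparse array versus dense clutter:
    for cluttered in (False, True):
        for parts in (1, 4):
            t = predicted_search_time(12, parts, cluttered)
            print(f"cluttered={cluttered!s:5}  parts={parts}  time={t:.2f} s")

Under these made-up numbers, simple and complex objects take the same time in the sparse display, but the complex objects become much slower to search once the display is cluttered, which is the pattern of "steeper search functions" Bravo describes.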

"In sparse displays, people process whole objects. In clutter displays, people process object parts," adds Bravo, who offers an example: In a sparse display, people might recognize a pineapple because of its characteristic shape -- oblong with a bushy top. In clutter, this recognition is more likely to be based on the pineapple's distinctive texture. "this is true even if the pineapple is completely visible," notes the Rutgers-Camden scholar. "It's the presence of the cluttered background that causes the problem."

"People think that you search based on objects," says Bravo. "You think you're seeing everything, but you're really seeing very, very little." The theory that observers seek and find an item amidst clutter based on distinctive parts is leading Bravo and her research team to develop a computer program that will help to predict which parts will be most distinctive to the searcher.

In another study, she asked people to identify pieces of objects. Even though the portions were small -- the screw in a pair of scissors, a piece of a hammer handle -- participants were able to identify the objects to which those parts belonged.

She then advanced that research by morphing pictures of different objects into unlikely creations -- such as a lampshade with a light bulb base, a pipe handle with a horn top -- and then asking people to find an animal amidst the clutter.

"This didn't present a problem to those we tested," she says. "Most didn't even notice that the lamp shade had a light bulb base. They just looked for the animal."

What does this mean for the airport baggage screener searching for guns or knives? The complexity, and not the familiarity, of the objects in the suitcase will determine the difficulty of their job. Moreover, the screener will be able to identify an offending object -- such as a gun -- simply by seeing a portion of the item, such as the handle.

Baggage screeners may not need to study pictures of weapons and of objects commonly found in suitcases, speculates Bravo. In fact, she suggests that the only way to improve performance in such complicated searches is to practice searching in clutter.

Bravo regularly teaches courses in Experimental Psychology and Physiological Psychology at Rutgers-Camden. A resident of Cherry Hill, she received her doctoral degree in neurobiology from Northwestern University.


Story Source:

Materials provided by Rutgers, the State University of New Jersey. Note: Content may be edited for style and length.


Cite This Page:

Rutgers, the State University of New Jersey. "Visual Research Seeks To Cut Through Clutter; Rutgers Work May Benefit Baggage Screening." ScienceDaily. ScienceDaily, 18 January 2006. <www.sciencedaily.com/releases/2006/01/060118091516.htm>.
