Jan. 9, 2003

For a fighter pilot, flagging attention could bring crashing consequences. The same holds true for air-traffic controllers, airport-security personnel and workers in many other fields.
And lapses in attention are always likely because vigilance is hard work for the human brain. But the drain on the brain can be reduced and performance enhanced by prompting attention, according to research by a team of University of Cincinnati and Catholic University of America psychologists.
The research, performed at the University of Cincinnati, which houses the nation's largest vigilance testing laboratory, was just published in the January 2003 issue of the quarterly journal Theoretical Issues in Ergonomics Science.
In the project, the researchers tested whether warning cues helped subjects complete vigilance tasks more effectively. Said UC psychology professor Joel Warm, "We've known since World War II that missed signals caused by declining vigilance are a serious problem for workers. It's not that workers are not motivated. The Navy personnel searching for submarines during World War II were highly motivated. Yet they still experienced sharp declines in attention and missed signals that meant lost lives.
"We have the same challenges today, as more and more human work involves automation, the supervised control of machines, whether the setting is an air-traffic control tower, a power plant, a long-distance truck or train, or a fighter jet. People think that vigilance, watching a screen or other mechanical processes, isn't 'doing much.' Actually, it's a great deal of work, and it's very stressful."

The study showed that one way to improve vigilance is simply to provide cues that a critical signal is coming. In fact, when subjects consistently received reliable cues before an important signal, vigilance and performance remained consistently high during a 40-minute "shift" in a test that mimicked an air-traffic control display. Critical signals for detection, consisting of planes traveling on a collision course, were mixed with non-critical signals, consisting of planes traveling on non-collision courses.
Subjects were divided into four groups. The first group received cues that were 100-percent reliable. In other words, they were told they would be forewarned, and that each cue of "look" would be followed by a critical signal. The second group was given cues that were 80-percent reliable (and were likewise told their warnings were 80-percent reliable). The third group was given cues that were 40-percent reliable (and, again, were told that their cues were 40-percent reliable). A final group was given no warnings.
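The four cueing conditions above can be sketched as a toy simulation. This is purely illustrative and not the researchers' actual task or code; the function names, trial counts and signal rate are assumptions made for the example:

```python
import random

def run_condition(cue_reliability, n_trials=1000, p_critical=0.2, seed=0):
    """Generate (cue, critical_signal) pairs for one hypothetical condition.

    cue_reliability: probability that a cue correctly predicts the upcoming
    trial type (1.0, 0.8 or 0.4 in the study); None models the no-cue group.
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        # A critical signal (planes on a collision course) occurs on some trials.
        critical = rng.random() < p_critical
        if cue_reliability is None:
            cue = None  # no-cue control group: no warning at all
        else:
            # An unreliable cue sometimes signals the wrong trial type.
            correct = rng.random() < cue_reliability
            cue = critical if correct else not critical
        trials.append((cue, critical))
    return trials

# The four groups from the study: 100%, 80%, 40% reliable cues, and no cue.
conditions = {"100%": 1.0, "80%": 0.8, "40%": 0.4, "no cue": None}
for label, reliability in conditions.items():
    trials = run_condition(reliability, seed=42)
    cued = [t for t in trials if t[0] is not None]
    if cued:
        hits = sum(1 for cue, critical in cued if cue == critical)
        print(f"{label} group: cue matched trial type on {hits/len(cued):.0%} of trials")
    else:
        print(f"{label} group: no cues presented")
```

The sketch only models how cue reliability is delivered to each group; it makes no claim about the detection performance the study actually measured.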
At first, both those who received cues to expect an important signal and those who did not receive any cues performed in a similarly effective manner. However, the performance of non-cued subjects declined considerably over time while performance efficiency remained stable (and high) over time for the "100-percent" group. By the end of the session, efficiency was clearly best in the "100-percent" group followed in order by the "80-percent," "40-percent" and then the "no-cue" groups.
The research has practical ramifications for the use of cueing in many automated systems designed to forewarn monitors of impending, untoward events in the work world, including prompts for combat pilots to take notice of an enemy weapon system's radar lock or of equipment problems in the aircraft. Given the consistent finding that cueing benefits vigilance performance, Warm said that "one might take the position in designing operational cueing systems that some cueing is better than none; however, the reliability of operational cueing systems must be high. Otherwise, they may offer no real benefit while incurring considerable cost in sustained attention."
Cueing, already used in aviation and at power plants, reduces the workload of surveillance and information processing. As such, cueing could be expanded to other applications; however, it must be highly reliable to be effective in reducing brain strain.
Along with detection efficiency in this study, blood flow to the brain likewise tracked the reliability of the warnings given. At the start of the study, blood flow to the brain was similar for all groups. However, it declined at different rates over time, depending on the warnings given. So, by the end of the vigil, blood flow was clearly highest in the "100-percent" group, followed in order by the "80-percent," "40-percent" and "no-cue" groups. This suggests that performance-related changes and blood flow to the brain may share common energy mechanisms. However, Warm warned that it remains to be seen what, precisely, those mechanisms consist of.
The UC vigilance lab seeks to develop and test means for increasing effectiveness, productivity and safety in the many applications in which sustained attention plays a role. For instance, vigilance affects routine medical testing; process controls at nuclear and other types of power plants; production quality control, including that within the pharmaceutical industry; and even long-distance driving and transportation of goods.
In addition to Warm, others involved in this study were:
William Dember, UC professor emeritus of psychology; Gerald Matthews, UC professor of psychology; Raja Parasuraman, professor of psychology at the Catholic University of America; Paula Shear, UC associate professor of psychology; former UC doctoral psychology students Edward Hitchcock and David Mayleben; and current UC doctoral psychology student Lloyd Tripp.

This research and other vigilance projects at UC are sponsored by NASA and the U.S. Air Force.
The above story is based on materials provided by the University of Cincinnati.