Aug. 15, 2000 COLUMBUS, Ohio - Researchers at Ohio State University are exploring ways for pilots to monitor automated flight deck systems with their skin instead of just their eyes and ears.
A study has revealed that pilots are more likely to notice changes in the status or behavior of these systems -- and react faster -- when these changes are announced in the form of a vibration instead of a conventional visual indicator.
The same technology may foster human-machine communication in other domains, said Nadine Sarter, an assistant professor in industrial, welding, and systems engineering.
Sarter reported in a recent issue of the journal Human Factors that, depending on workload and phase of flight, pilots detected as many as 40 percent more signals -- sometimes more than twice as rapidly -- when they received vibrations from a small pager-like device worn like a wristwatch.
"We have many senses, and touch is one of the most underutilized, and a very powerful one," Sarter said. "Pilots receive a great deal of visual and audio feedback, but tactile cues are rare."
Sarter's initial work in this area at the University of Illinois at Urbana-Champaign was funded by the Federal Aviation Administration. Now the National Science Foundation funds her continued work at Ohio State, with the idea that her findings may be useful outside aviation as well.
"The same principle could work in an operating room, nuclear power plant, or space shuttle -- any highly automated place where humans must communicate with machines," she said.
Today's computerized aircraft are capable of performing many tasks independently, Sarter explained. An on-board computer could initiate a maneuver or safety procedure when the pilot doesn't expect it. If the pilot doesn't notice, he or she may be surprised later when the change becomes more apparent -- and possibly difficult to recover from.
"More and more, on-board computers present those changes in the form of visual feedback -- words and numbers that appear on numerous displays in the cockpit," Sarter said. "So the pilot has to monitor all the instruments very carefully and combine information from many different displays to get an idea of what the automation is doing."
For her research, Sarter has been using a vibrating device called a "tactor," manufactured by Audiological Engineering Corporation for people with visual or hearing impairments. Like a pager, the tactor vibrates when it receives a signal.
For this most recent Human Factors study, Sarter and former graduate student Aaron Sklar tested the reactions of 21 certified flight instructors, 16 men and five women, who averaged more than 600 hours of flight experience.
The pilots took turns in a flight simulator with one-third of them receiving notification of changes in automation status and behavior through visual cues only. One-third wore a tactor, and received the information only through a vibration on their wrist. The remaining one-third received a vibration at the same time as a visual cue.
Pilots receiving only visual cues noticed changes in the computer's status just 83 percent of the time. Pilots in the other two groups noticed the changes nearly every time.
Pilots wearing tactors were also faster to respond to changes. In one case, they responded in less than half the time required by pilots who relied on visual cues alone -- about two seconds versus five. For most other events, pilots wearing the tactors responded about one second sooner.
Sarter and Sklar also discovered some limitations to tactile feedback. For example, pilots sometimes didn't notice vibrations that occurred while the arm wearing the tactor was busy with some other task, such as manipulating aircraft controls. There is also the risk that existing vibrations on the flight deck may interfere with the perception of the tactor feedback, Sarter said.
In collaboration with one of her current graduate students, Mark Nikolic, Sarter has also looked at another powerful sensory channel for capturing attention: peripheral vision. In another simulator study to be published in Human Factors, Sarter and Nikolic have shown that this channel may not be as effective as the vibrotactile cues, but it is still far more successful than current focal visual indications of automation status and behavior.
"Ultimately, our goal is to distribute information across various sensory channels -- audio, focal visual, peripheral visual, and tactile -- to support task-sharing and adapt to different task and flight contexts more effectively," Sarter said.