ARLINGTON, Va. -- A "sonic flashlight" developed by a biomedical engineer at the University of Pittsburgh makes the human body seem translucent right before your eyes.
The prototype device merges the visual outer surface of a patient's skin with a live ultrasound scan of what lies beneath. It creates the effect of a translucent ultrasound image floating in its actual 3-D location within the patient, showing blood vessels, muscle tissue, and other internal anatomy.
"In the practice of medicine, the standard method of viewing an image is still to examine a film or screen rather than look directly into the patient," said George Stetten, M.D., Ph.D., assistant professor of bioengineering.
Doctors currently use ultrasound to guide invasive procedures, such as inserting a needle into a vein. But to do so, they must look away from the patient at an ultrasound display screen, which disrupts hand-eye coordination.
"The difficulty in mastering these skills has motivated research into developing a more natural way to visually merge ultrasound images with the perceptual real world," Stetten said. His device enables the viewer to look directly at patients and see their internal anatomy.
Previous attempts to fuse medical images with direct vision have been largely unsuccessful, in part because of their complexity. Some have tried using miniature video cameras mounted on a headpiece. Others have used an approach similar to Stetten's but requiring the user to wear a tracking device to determine viewer location.
Stetten has eliminated the need for tracking devices and transmitters by taking full advantage of the way in which a translucent mirror superimposes images from both sides of the glass.
He strategically positions an ultrasound scanner and the ultrasound display on opposite sides of a half-silvered, translucent mirror (Fig. 1). The viewer looks through the mirror to see the patient and the ultrasound scanner positioned on the patient's skin. At the same time, the ultrasound image is projected on the viewer's side of the mirror in perfect alignment with the corresponding location within the patient's body.
This makes the ultrasound image appear to occupy the same physical space as the body part being imaged. Even if the viewing angle changes, the combined images remain true. The effect relies on precise geometric relationships (Fig. 2) between the ultrasound slice being scanned, the monitor displaying the slice, and the mirror.
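The viewer-independence of the effect follows from plane-mirror optics: a mirror's virtual image of a displayed point sits at that point's reflection across the mirror plane, regardless of where the viewer stands. A minimal sketch of this geometry (the mirror plane, coordinates, and point locations here are hypothetical, not taken from Stetten's actual apparatus):

```python
import numpy as np

def reflect_across_plane(p, plane_point, plane_normal):
    """Reflect point p across the plane through plane_point with the given normal."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = np.dot(p - plane_point, n)
    return p - 2 * d * n

# Hypothetical mirror: passes through the origin, normal along the z-axis
mirror_point = np.array([0.0, 0.0, 0.0])
mirror_normal = np.array([0.0, 0.0, 1.0])

# A point inside the patient on the scanned ultrasound slice (below the mirror)
anatomy_point = np.array([2.0, 1.0, -3.0])

# For alignment, the monitor must display that point's echo at the
# mirror image of the anatomical location (above the mirror)
monitor_pixel = reflect_across_plane(anatomy_point, mirror_point, mirror_normal)

# The mirror then reflects the displayed pixel back: its virtual image is
# the reflection of the pixel, which coincides with the anatomy point
virtual_image = reflect_across_plane(monitor_pixel, mirror_point, mirror_normal)

print(np.allclose(virtual_image, anatomy_point))  # → True
```

Because reflection across a plane is its own inverse, the virtual image lands exactly on the anatomical point, which is why no head tracking is needed: the alignment holds for every viewing angle at once.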
"We are actually merging the virtual image in 3D with the interior of the patient," Stetten said. "The reflected image is optically indistinguishable from the corresponding space within the patient."
The result is an image within the natural field of view that can be used to guide invasive procedures, such as taking blood samples without missing the vein, or doing needle biopsies, amniocenteses, catheterizations, surgery, or numerous other procedures while looking directly at the patient instead of at a monitor.
Stetten named the process "tomographic reflection" and the device a "sonic flashlight." He presented the concept and the device at recent meetings of the IEEE and ACM International Symposium on Augmented Reality and at the Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2001 conference. The research has also been published in the Journal of Ultrasound in Medicine.
Jules H. Sumkin, M.D., chief of radiology at Magee Women's Hospital and professor of radiology at the University of Pittsburgh, said there is a significant learning curve for radiologists and doctors to look at an ultrasound monitor while performing a procedure on a patient. Once mastered, however, the technique is used routinely.
"Potentially his technology could shorten that curve," said Sumkin, who tried the sonic flashlight for imaging but has not tested it clinically. "It has some significant potential application, but I want to see his next prototype."
Stetten has also built a portable sonic flashlight that could make it easier and more convenient for routine use in a doctor's office. Both the stationary and portable devices will need to be refined and tested in the laboratory before being tested in the clinic.
Stetten's research was performed in collaboration with the Robotics Institute at Carnegie Mellon University. He received a research grant from The Whitaker Foundation in 1994.