
Go Fetch! VIMS Submersible Has Anti-terrorism Potential

April 14, 2003
College of William and Mary
Researchers at the Virginia Institute of Marine Science, led by Mark Patterson, associate professor of marine science, have developed an artificial neural network for use with an autonomous underwater vehicle (AUV) named Fetch.

Remote controlled, robot-like submarines could soon be patrolling America’s shores. Sound fishy? Well—not the way you might think.

Fish and their diminishing populations are the inspiration behind an innovative new technology—neural network-driven fish-recognition software—that soon could be at the forefront of homeland security systems.

In the late 1990s Mark Patterson, associate professor of marine science at the Virginia Institute of Marine Science (VIMS), and Jim Sias, president of Sias Patterson Incorporated, invented Fetch, an autonomous underwater vehicle (AUV). The mini-robotic submarine travels underwater at depths of up to 1,000 feet on a pre-programmed course. Fetch2, the latest generation of this AUV, is equipped with side-scan sonar. In 2001, Patterson, Zia-ur Rahman (Department of Applied Science at William and Mary) and Roger Mann (VIMS) received a grant through the Commerce Department's Sea Grant program to investigate image processing for the data collected by the side-scan sonar. Maintaining accurate fish population counts is important in the battle to preserve marine ecosystems, and the counts provide vital data for shaping environmental regulations.

In order to count the fish, marine scientists must see them. This is no easy task in the depths of the world's seas and oceans, where visibility is generally low. However, sound waves can be used to detect objects underwater in even the murkiest conditions. The volumes of sonar data can, in turn, be analyzed to reveal properties of the objects—such as size, shape and density. With this technology in place, the scientists wanted to go a step further and develop automatic identification and quantification for Fetch2's computer.
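The most basic property sonar recovers is distance: sound travels to the target and back, so range is half the round-trip time multiplied by the speed of sound in seawater. The sketch below illustrates that relationship; it is a textbook calculation, not code from the Fetch2 system, and the 1,500 m/s sound speed is a typical value, not a measured one.

```python
# Hypothetical illustration of ranging from a sonar echo; not taken
# from the actual Fetch2 software.
SOUND_SPEED_SEAWATER = 1500.0  # m/s, a typical value for seawater

def echo_range(travel_time_s: float) -> float:
    """Range to a target, given the two-way (round-trip) echo time."""
    return SOUND_SPEED_SEAWATER * travel_time_s / 2.0

print(echo_range(0.4))  # a 0.4 s round trip corresponds to 300.0 m
```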

Could the computer analyze data in this way?

“Yes,” said Rahman. Characteristics of different fish species were compiled using the side-scan sonar data. This information was then grouped into test sets used for training artificial neural networks (ANNs). The team combined enhancement algorithms and image processing with the ANNs to “teach” the computer to recognize characteristics of various species. As reported in the February 15 edition of New Scientist, the training was successful: the scientists were able to have Fetch2 recognize two fish species—jacks and sharks. Fish of other species did not fool the classifier.
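The workflow described above—compile per-species characteristics from sonar returns, then train a network to separate them—can be sketched in miniature. Everything below is an assumption for illustration: the feature names, the synthetic data, and the single-layer logistic classifier are stand-ins, not the actual VIMS/Fetch2 network or its training data.

```python
import numpy as np

# Illustrative sketch only: features, values, and model are invented,
# not the actual Fetch2 classifier.
rng = np.random.default_rng(0)

# Synthetic "sonar-derived" features per echo:
# [target length (m), aspect ratio, mean echo intensity].
jacks = rng.normal([0.5, 3.0, 0.4], 0.05, size=(50, 3))
sharks = rng.normal([2.0, 6.0, 0.7], 0.05, size=(50, 3))

X = np.vstack([jacks, sharks])
y = np.array([0] * 50 + [1] * 50)  # 0 = jack, 1 = shark

# A tiny one-layer logistic classifier trained by gradient descent,
# standing in for the article's artificial neural network.
w = np.zeros(3)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid activation
    w -= 0.5 * ((p - y) @ X) / len(y)       # cross-entropy gradient step
    b -= 0.5 * (p - y).mean()

def classify(features):
    """Label one feature vector as 'jack' or 'shark'."""
    p = 1.0 / (1.0 + np.exp(-(features @ w + b)))
    return "shark" if p > 0.5 else "jack"

print(classify(np.array([0.5, 3.0, 0.4])))  # a jack-like echo
print(classify(np.array([2.0, 6.0, 0.7])))  # a shark-like echo
```

In practice the rejection behavior the article mentions (fish of other species not fooling the classifier) would require a confidence threshold or an explicit "unknown" class rather than this two-way decision.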

Said Patterson, “It’s amazing how well this particular type of neural network works with noisy data. In the future, we hope to expand the classifier’s library to include dozens of species, enabling scientists to perform stock assessments non-destructively—i.e., you won’t need to catch a fish to count it.”

“We have only scratched the surface of this technology,” said Rahman. “The computer could be trained to recognize anything—a person swimming, a submarine, a missile or a mine, anything.” Ultimately, the scientists hope to have Fetch2 autonomously follow the objects it detects.

Once programmed to discriminate among underwater objects, Fetch2 could patrol coastlines, harbors, the hulls of vessels, bridge footings and other U.S. vital interests, becoming an important tool in the war on terror and the battle to keep our shores safe.

Story Source:

Materials provided by College of William and Mary. Note: Content may be edited for style and length.

Cite This Page:

College of William and Mary. "Go Fetch! VIMS Submersible Has Anti-terrorism Potential." ScienceDaily, 14 April 2003.