Science News

Yale Sonar Robot Modeled After Bat And Dolphin Echolocation Behavior

Date:
October 6, 1997
Source:
Yale University--Office of Public Affairs
Summary:
A robot inspired by the ability of bats and dolphins to use echoes for locating prey is causing robotics experts to reevaluate the relative merits of sound waves versus camera vision for exploring new environments. The sonar device, which was designed and created by Yale University electrical engineering professor Roman Kuc, is so sensitive that it can tell whether a tossed coin has come up heads or tails.

New Haven, CT -- A robot inspired by the ability of bats and dolphins to use echoes for locating prey is causing robotics experts to reevaluate the relative merits of sound waves versus camera vision for exploring new environments. The sonar device, which was designed and created by Yale University electrical engineering professor Roman Kuc, is so sensitive that it can tell whether a tossed coin has come up heads or tails.

"In the early days of robot design, primitive navigational sonars were often used to locate objects, but cameras were needed to identify them," says Professor Kuc, who has designed mobile robots and navigating wheelchairs equipped with ultrasound sensors during 10 years of robotics research. "In recent years, scientists have virtually abandoned sonar detection in favor of camera vision for robots, but we decided to take a closer look at how echolocation is used in nature to see if we might be missing something."

Advances in camera-vision research have reached a plateau because scientists have encountered formidable obstacles in duplicating the incredible power and subtlety of human vision, Professor Kuc says. Yale's design for sonar detection, on the other hand, could prove easier and less costly than camera vision for identifying an authorized customer at an automated teller machine, detecting production flaws on an assembly line, or helping someone who is paralyzed interact with a computer.

Called Rodolph -- short for robotic dolphin -- Yale's robot is equipped with three Polaroid electrostatic transducers that can act either as transmitters or receivers to serve as the robot's "mouth" and "ears." The transducers are similar to those used in Polaroid autofocus cameras to gauge an object's range, and in acoustic digital tape measures that use echoes to measure distances.

Attached to the end of a robotic arm, the transducer in the center emits sound waves that bounce off objects, much like the high-pitched squeals of bats and the clicking sounds of dolphins. The robot's mouth is flanked by two rotating transducer ears that act as receivers for detecting echoes. The design is inspired by bats, whose ears react by rotating in the direction of an echo source, and by dolphins, who appear to move around in order to place an object at a standard distance, Professor Kuc explains in the cover article of the August issue of the Journal of the Acoustical Society of America.

The robot's bobbing head with its twitching ears has an eerie, animal-like quality as it scans the environment in Professor Kuc's laboratory, scrutinizing coins and distinguishing between various sizes of rubber O-rings and ball bearings. A human hand inadvertently passing through its sonar field sets off scanning motions reminiscent of an inquisitive cat.

"The robot exploits the important biological principle of sensor mobility to place an object at a constant distance, thus reducing the complexity of object recognition," Professor Kuc says, adding that the rotating ears also help pinpoint and amplify the sound. "Then the robot can either learn a new object by adding the echoes to its memory, or identify an old object already in its database."

Controlled by a Pentium 120 processor in a personal computer, the robot emits ultrasound pulses at 60 kilohertz as often as 10 times a second. The ears rotate separately on the moving arm, helping to position the sonar 15 centimeters from the object, plus or minus 0.1 millimeter.
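The ranging principle behind numbers like these is ordinary time-of-flight sonar: the round-trip delay of an echo, multiplied by the speed of sound and halved, gives the distance. The sketch below illustrates the arithmetic; the 15-centimeter working distance comes from the article, while the speed of sound (about 343 m/s in room-temperature air) and the function name are illustrative assumptions, not details of the Yale system.

```cpp
// Illustrative sketch (not the Yale code): convert an ultrasonic echo's
// round-trip time into a range estimate.
double rangeFromEcho(double roundTripSeconds, double speedOfSound = 343.0) {
    // The pulse travels out to the object and back, so halve the path length.
    return speedOfSound * roundTripSeconds / 2.0;
}
```

At the robot's 15-centimeter working distance, an echo returns in roughly 2 × 0.15 / 343 ≈ 0.87 milliseconds, so a 10-pulse-per-second rate leaves ample time between readings.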

Previous sonar robots required a far larger number of sonar readings from different angles and distances for object identification, but Professor Kuc's robot requires only a single reading because it can move around and scan until it arrives at a predetermined distance from the object.

The data the robot gathers during a learning process are logarithmically compressed to emphasize slight differences in structure and reduced to produce vectors containing 32 different features. Each object is represented by approximately 50 vectors that form clusters, each cluster corresponding to a particular view angle, says Professor Kuc, who uses the C++ programming language for processing.
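The scheme described above -- log-compressed 32-feature vectors, clustered by view angle, matched against a stored database -- can be sketched as a nearest-vector classifier. The feature count (32), the roughly 50 vectors per object, the log compression, and the C++ setting come from the article; everything else here (names, the Euclidean distance metric, the database layout) is an illustrative assumption.

```cpp
#include <array>
#include <cmath>
#include <limits>
#include <string>
#include <vector>

// One echo reading reduced to 32 features, as described in the article.
using Feature = std::array<double, 32>;

struct KnownObject {
    std::string name;
    std::vector<Feature> clusters;  // ~50 view-angle vectors per object
};

// Log-compress raw echo magnitudes to emphasize slight structural
// differences (log1p keeps small values well-behaved near zero).
Feature compress(const Feature& raw) {
    Feature out{};
    for (std::size_t i = 0; i < raw.size(); ++i)
        out[i] = std::log1p(std::abs(raw[i]));
    return out;
}

double squaredDistance(const Feature& a, const Feature& b) {
    double d = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i)
        d += (a[i] - b[i]) * (a[i] - b[i]);
    return d;
}

// Identify the stored object whose nearest cluster best matches the echo;
// a "learn" step would instead append the new vector to an object's clusters.
std::string identify(const Feature& echo, const std::vector<KnownObject>& db) {
    std::string best = "unknown";
    double bestDist = std::numeric_limits<double>::max();
    for (const auto& obj : db) {
        for (const auto& c : obj.clusters) {
            double d = squaredDistance(echo, c);
            if (d < bestDist) {
                bestDist = d;
                best = obj.name;
            }
        }
    }
    return best;
}
```

Because the robot always takes its single reading at the same 15-centimeter standoff, the stored vectors and the query vector are directly comparable, which is what makes this simple nearest-match scheme plausible.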

The next step is to mount the stationary robotic arm on a mobile base to enable Rodolph to explore its environment, says Professor Kuc, whose research is supported by the National Science Foundation.


Story Source:

Materials provided by Yale University--Office of Public Affairs. Note: Content may be edited for style and length.


Cite This Page:

Yale University--Office of Public Affairs. "Yale Sonar Robot Modeled After Bat And Dolphin Echolocation Behavior." ScienceDaily. ScienceDaily, 6 October 1997. <www.sciencedaily.com/releases/1997/10/971006203840.htm>.
