Computational Sensors Could Steer a Car or Guide Surgical Tools
A Johns Hopkins University electrical engineer has developed a new robotic vision system on a microchip that enables a toy car to follow a line around a test track, avoiding obstacles along the way. In the near future, the same technology may allow a robotic surgical tool to locate and operate on a clogged artery in a beating human heart.
The key to this system, says Ralph Etienne-Cummings, is a single chip that combines several critical functions: It performs analog and digital processing, extracts relevant information, makes decisions and communicates them to the robot. If the system in the toy car "sees" an obstacle ahead, it directs the vehicle to move around it. If the chip is used in a surveillance system, the camera can follow a moving target.
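The pipeline described above — sensing, feature extraction, decision making and communication all in one place — can be illustrated with a minimal sketch. This is not the actual chip design, only a toy model of the idea that the decision loop sits next to the sensor; the brightness threshold and command names are invented for illustration.

```python
def extract_feature(pixels):
    """Toy feature extraction: report whether anything bright lies dead ahead.

    The 0.5 brightness threshold is a made-up illustration, not a chip parameter.
    """
    center = pixels[len(pixels) // 2]
    return center > 0.5


def decide(obstacle_ahead):
    """Decision making happens next to the sensor, not on a separate computer."""
    return "steer_around" if obstacle_ahead else "continue"


def sense_and_act(pixels):
    """One pass through the whole on-chip loop: sense, extract, decide, command."""
    return decide(extract_feature(pixels))


print(sense_and_act([0.1, 0.9, 0.1]))  # bright object ahead -> steer_around
print(sense_and_act([0.1, 0.0, 0.1]))  # clear path -> continue
```

Because no data leaves the chip between sensing and deciding, there is no round trip to an external computer — which is the source of the speed advantage the article describes.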
Because the decision making is done on the microchip itself, not on a separate computer, the response time is much faster than that of other robotic vision systems, the researcher says. Also, he says, this system is much smaller, uses less power and can be mounted on mobile machines, including law-enforcement microrobots, autonomous flying machines and extraterrestrial rovers.
"The idea of putting electronic sensing and processing in the same place is called computational sensing," explains Etienne-Cummings, an assistant professor of electrical and computer engineering at Johns Hopkins. "It was coined less than 10 years ago by the people who started this new line of research. Our goal is to revolutionize robotic vision or robotics in general. It hasn't happened yet, but we're making progress."
In a paper presented at the Conference on Intelligent Robots and Systems in Victoria, British Columbia, Etienne-Cummings outlined the advantages of this technology and described his success in using it in a toy car that maneuvered around a track without help from a human controller. The project marked one of the first times a biologically inspired computational visual sensor has been used to guide a robotic vehicle around obstacles as it followed a simulated road.
The technology used in this test has many potential applications, the engineer believes. Beyond their role in autonomous navigation and medical systems, computational sensors could allow robots to identify and pick up parts in manufacturing plants. In a video-conferencing system, a computational sensor could enable the camera to "lock on" to a speaker who wished to move around the room, the researcher says.
By processing and reacting to light as soon as it hits the system, these sensors take a cue from Mother Nature. "This resembles the early type of processing that takes place in a rabbit's eye or a frog's," Etienne-Cummings says. "The animal sees a shape moving up ahead. If the shape is small enough, it may be food, so the animal moves toward it. But if it's too large, it might be a threat to its safety, so the animal runs away. These reactions happen very, very quickly, in the earliest moments of biological processing."
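The rabbit-and-frog reaction Etienne-Cummings describes amounts to a fast threshold decision on the apparent size of a moving shape. A minimal sketch of that reflex, with an invented threshold value purely for illustration:

```python
def react(apparent_size, food_max=0.2):
    """Size-threshold reflex inspired by the rabbit/frog example:
    a small moving shape may be food, so approach it; a large one may be
    a threat, so flee. The food_max threshold is a made-up illustration,
    not a measured biological parameter.
    """
    return "approach" if apparent_size <= food_max else "flee"


print(react(0.1))  # small shape -> approach
print(react(0.6))  # large shape -> flee
```

The point of the analogy is that such a decision needs almost no computation and can therefore happen in the earliest moments of processing, right where the light arrives.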
When he designs computational sensors, the Johns Hopkins researcher is not trying to make electronic versions of biological cells and brain tissue. "I'm just trying to mimic their function," he says, "using the best electronic tools I can find."
In the toy car that Etienne-Cummings adapted, two sensors are mounted as "eyes" on the front of the vehicle. The microchips force the car to follow a line detected by the sensors, unless an obstacle appears in its path. To the chips, avoiding a crash takes priority over following the line, so they steer the car away from the obstacle. The system also "remembers" how it turned to avoid the obstacle so that it can steer the car back to the line to resume its original course.
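The control logic described above — line following, obstacle avoidance that takes priority over it, and a memory of evasive turns used to return to the line — can be sketched as a simple state machine. This is only an illustration of the behavior the article describes, not the chips' actual circuitry; the command names and the turn-counting scheme are assumptions.

```python
class LineFollower:
    """Toy model of the car's behavior: follow the line, but let obstacle
    avoidance override it, and remember evasive turns so they can be
    unwound to steer back to the line afterward."""

    def __init__(self):
        self.turn_debt = 0  # net evasive turns owed back to regain the line

    def step(self, line_offset, obstacle_ahead):
        # Avoiding a crash takes priority over following the line.
        if obstacle_ahead:
            self.turn_debt += 1      # remember how we turned away
            return "turn_right"
        # With the path clear, unwind remembered turns to rejoin the line.
        if self.turn_debt > 0:
            self.turn_debt -= 1
            return "turn_left"
        # Otherwise steer to keep the detected line centered.
        if line_offset < 0:
            return "turn_left"
        if line_offset > 0:
            return "turn_right"
        return "straight"


car = LineFollower()
print(car.step(0, True))    # obstacle detected -> turn_right (and remember it)
print(car.step(0, False))   # path clear -> turn_left, unwinding the turn
print(car.step(0, False))   # back on the line -> straight
```

The priority ordering in `step` mirrors the article's description: the avoidance rule is checked first, so line following only resumes once the obstacle is past and the remembered turns are paid back.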
Etienne-Cummings has also begun working with Johns Hopkins biomedical engineering researchers who are creating computer models of the heart. He hopes to use computational sensor technology to enable a robot arm to keep pace with a beating heart. If this technology is perfected, surgeons of the future may be able to use the robot to clear a blocked cardiac artery without having to stop the heart first, as doctors must do today.
To achieve such advances, closer collaboration between microchip designers and the mechanical engineers who build robots is essential, Etienne-Cummings says. "The people who assemble robots don't have access to the sensors that I design," he says. "I think that's one of the things that has prevented computational sensors from making greater inroads in robotics. We are two divided communities, and we haven't been talking to each other. Now, we're finally starting to have those conversations."
Etienne-Cummings, who was born in the Seychelles, an island nation off the east coast of Africa, earned his doctorate in electrical engineering at the University of Pennsylvania. He joined the faculty of Johns Hopkins' Whiting School of Engineering in the summer of 1998.
The above post is reprinted from materials provided by Johns Hopkins University. Note: Materials may be edited for content and length.