
Penn State Engineers Boost Tracking Ability Of Robotic "Eyes"

Date:
July 6, 1999
Source:
Penn State
Summary:
By coupling the latest computer vision and control techniques, Penn State engineers have developed a model robotic system with enhanced ability to track moving targets in real world environments.

University Park, Pa. --- By coupling the latest computer vision and control techniques, Penn State engineers have developed a model robotic system with enhanced ability to track moving targets in real world environments.

Dr. Octavia I. Camps, associate professor of electrical engineering and computer engineering, and Dr. Mario Sznaier, associate professor of electrical engineering, say their approach opens up the possibility of applying active vision to a broader range of real world problems, including Intelligent Vehicle Highway Systems, robot-assisted surgery, 3D reconstruction, inspection, vision-assisted grasping, MEMS microassembly, automated spacecraft docking and surveillance.

Their approach and model system are detailed in a paper, "Active Vision: A New Challenge for Robust Control Theory," presented Thursday, July 1, in Hong Kong at the International Workshop on the Control of Uncertain Systems, organized by the International Federation of Automatic Control.

The authors note that in all of the proposed applications, incorporating feedback offers the possibility of achieving acceptable performance even with poorly calibrated cameras, blurred images, or only partially visible targets.
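
To make the feedback idea concrete, here is a minimal sketch of an image-based tracking loop in Python: the detected target position is compared with the desired position in the image (here supplied as `frame_center`), and the pixel error drives the camera's pan/tilt rates. This illustrates the general principle only, not the authors' controller; the `detect_target` callable, the gain value, and the "coast on the last command" behavior are assumptions made for the sketch.

```python
import numpy as np

def tracking_step(frame, detect_target, frame_center, gain=0.002, last_cmd=(0.0, 0.0)):
    """One cycle of a simple image-based feedback loop.

    detect_target(frame) is assumed to return the target's (x, y) pixel
    position, or None when the target is not found (blur, occlusion).
    The pan/tilt rate command is proportional to the pixel error between
    the detection and the desired image position; when the detector
    fails, the previous command is reused so the camera keeps moving.
    """
    measurement = detect_target(frame)
    if measurement is None:
        return last_cmd                                # coast on the last command
    error = np.subtract(frame_center, measurement)     # pixel error (ex, ey)
    pan_rate, tilt_rate = gain * error                 # proportional feedback
    return (float(pan_rate), float(tilt_rate))
```

Because the loop closes on the image error rather than on an absolute world position, modest calibration errors mostly change the effective loop gain, which a well-designed feedback law can tolerate.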

Camps says, "For example, instead of learning to identify a whole object, the computer can be trained to identify an object by its significant parts. So, if another object is in front of it or the camera is out of focus, the computer can still recognize it."
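
One way to picture that kind of parts-based recognition is to match templates of several distinctive parts independently and accept the object when enough of them are found, so a single occluded or blurred part does not defeat the detection. The sketch below illustrates the idea with OpenCV template matching; it is not the researchers' algorithm, and the part templates, score threshold, and minimum-parts rule are hypothetical.

```python
import cv2

def find_object_by_parts(frame_gray, part_templates, score_thresh=0.7, min_parts=2):
    """Return the locations of the object parts that matched well enough.

    part_templates: list of small grayscale images, one per distinctive part.
    The object is declared present if at least `min_parts` parts are found,
    so a partially occluded or defocused object can still be recognized.
    """
    hits = []
    for part in part_templates:
        scores = cv2.matchTemplate(frame_gray, part, cv2.TM_CCOEFF_NORMED)
        _, best_score, _, best_loc = cv2.minMaxLoc(scores)
        if best_score >= score_thresh:
            hits.append(best_loc)
    return hits if len(hits) >= min_parts else []
```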

Using appearance-based object recognition techniques, the researchers have also been able to speed up the rate at which the computer program recognizes the target to about one image every 33 milliseconds.
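
Appearance-based methods of this kind typically project raw image windows onto a low-dimensional "eigenspace" learned from training views and match in that compressed space; the handful of dot products needed per window is what makes a rate of roughly one image every 33 milliseconds (about 30 frames per second) plausible. The NumPy sketch below shows that general eigenspace idea and should not be read as the specific technique used in the Penn State system; the dimension `k` and distance threshold are assumptions.

```python
import numpy as np

def learn_eigenspace(training_windows, k=10):
    """training_windows: (n_samples, n_pixels) array of flattened,
    brightness-normalized training views of the target."""
    mean = training_windows.mean(axis=0)
    centered = training_windows - mean
    # Principal directions of appearance variation (top-k right singular vectors).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:k]                       # (k, n_pixels)
    coords = centered @ basis.T          # training views expressed in eigenspace
    return mean, basis, coords

def recognize(window, mean, basis, coords, max_dist=5.0):
    """Project a candidate window into the eigenspace and accept it
    if it lies close to some stored training view."""
    z = (window - mean) @ basis.T        # only k dot products per window
    dists = np.linalg.norm(coords - z, axis=1)
    return dists.min() <= max_dist
```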

"By bringing control theory tools that have emerged in the last five years to bear on these problems, along with recent advances in hardware, we've been able to optimize the system using off-the-shelf hardware from a variety of manufacturers," says Sznaier.

He adds that using modern control tools, such as robust identification and multi-objective robust control, also takes the guesswork out of control design.
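
The flavor of such a design process can be suggested with a toy example: rather than hand-tuning a gain against one nominal plant, the designer states an uncertainty set and a mix of objectives (tracking error and control effort) and selects the controller with the best worst-case cost. The brute-force Python sketch below is only a stand-in for the robust identification and multi-objective synthesis machinery the researchers describe; the integrator plant, the uncertainty range, and the cost weights are all assumptions made for illustration.

```python
import numpy as np

def worst_case_cost(gain, plant_gains, w_track=1.0, w_effort=0.1, steps=200):
    """Toy robust, multi-objective evaluation of a proportional tracker.

    The plant is a discrete integrator x[t+1] = x[t] + b*u[t] whose input
    gain b is only known to lie in `plant_gains` (the uncertainty set).
    The cost mixes tracking error and control effort, and the worst case
    over the uncertainty set is returned.
    """
    worst = 0.0
    for b in plant_gains:
        x, cost = 1.0, 0.0                # start with a unit tracking error
        for _ in range(steps):
            u = -gain * x                 # proportional feedback
            cost += w_track * x**2 + w_effort * u**2
            x = x + b * u
        worst = max(worst, cost)
    return worst

# Pick the gain with the best worst-case mixed cost over the uncertainty set.
plant_gains = np.linspace(0.5, 1.5, 11)   # assumed uncertainty in the plant gain
candidates = np.linspace(0.1, 1.5, 29)
best_gain = min(candidates, key=lambda k: worst_case_cost(k, plant_gains))
print(f"selected feedback gain: {best_gain:.2f}")
```

The point of the toy is that the selected gain is justified by an explicit worst-case criterion rather than by trial and error, which is the sense in which modern tools "take the guesswork out" of control design.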

The Penn State researchers note that to fully exploit the capabilities of newly available hardware, the control and computer vision aspects of the problem must be addressed together. When they are, the synthesis offers vision systems capable of moving beyond carefully controlled environments.

Dr. Camps is a specialist in computer vision and Dr. Sznaier focuses on control. Their research is supported in part by grants from the National Science Foundation, the National Aeronautics and Space Administration and the Air Force Office of Scientific Research.


Story Source:

Materials provided by Penn State. Note: Content may be edited for style and length.


