Oct. 9, 2002 Computer scientists at OHSU's OGI School of Science and Engineering have stripped down a radio-controlled monster truck and turned it into a robotic vehicle that's helping them perfect the latest generation of computer software. The "Timbot" is equipped with its own computer system, a video camera, and other sensors. On a very basic level, it can "decide" where it needs to go, using its sensors and an onboard computer, and, at the same time, transmit live video images across the Internet so that a remote viewer can see what Timbot is looking at. The Timbot is one part of Project Timber (Time as a basis for embedded real-time systems) in the Pacific Software Research Center. The project is funded by the Defense Advanced Research Projects Agency (DARPA).
While it may look like a toy on the outside, Timbot is actually a tool to test original software developed at the School of Science and Engineering that may one day guide unmanned robotic vehicles such as cars, buses or even aircraft. The software is called "embedded software" because it is embedded inside of another system (in this case a robotic monster truck) and serves the specific needs of that system (in this case guiding the vehicle's movement).
"Actually, embedded systems are all around us," says Mark Jones, DPhil, Associate Professor of Computer Science and Engineering at OHSU. "Many devices--from household appliances to medical devices, bank ATMs, automobiles, and aircraft--have computers embedded in them. Nobody likes having to reboot a desktop computer that has crashed, but the consequences of a bug in the software that controls embedded systems like these could be much more serious."
The Timber project is developing new software that will make it easier to construct more reliable embedded systems, and is using Timbot to demonstrate how these technologies might be used in practice. "Think of what happens when you give someone driving directions," says Jones. "I can tell you how to drive from downtown Portland to the OHSU West Campus, but I won't need to tell you how to operate your vehicle, or to remind you to stop at red lights or purchase gas if you need it. In the same way, Timbot will operate in a largely autonomous manner, responding to high-level instructions about where it should go, but taking care of the details by itself."
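The driving-directions analogy can be sketched in code: the operator issues only a high-level goal, and the vehicle object takes care of the low-level details itself. This is an illustrative sketch, not the Timbot's actual software (which the Timber project develops in its own language); the class and method names here are hypothetical.

```python
# Hypothetical sketch of a high-level command interface: the caller
# supplies only a destination, and the Vehicle handles steering and
# obstacle checks internally. Names are illustrative, not Timbot's code.

class Vehicle:
    def __init__(self):
        self.position = (0, 0)  # simple grid position for illustration

    def go_to(self, destination):
        """High-level instruction: drive to a destination."""
        while self.position != destination:
            self._check_obstacles()        # low-level detail, handled onboard
            self._step_toward(destination)  # likewise
        return self.position

    def _check_obstacles(self):
        pass  # sonar/vision checks would go here in a real system

    def _step_toward(self, dest):
        # Move one grid cell toward the destination on each axis.
        step = lambda a, b: a + (1 if b > a else -1 if b < a else 0)
        x, y = self.position
        self.position = (step(x, dest[0]), step(y, dest[1]))

truck = Vehicle()
print(truck.go_to((3, 2)))  # the caller never steers directly -> (3, 2)
```

The point of the design is the same as in Jones's analogy: the interface exposes destinations, not steering commands.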
Timbot is helping its creators develop systems that support "graceful degradation", adapting flexibly and dynamically to changes in their environment instead of failing outright under system overload. When Timbot encounters challenging conditions – such as an obstacle in its path – that require extra computational power, it will automatically cut back on some of its other functions instead of freezing up or crashing. "For example, if the Timbot needs to perform expensive calculations to ensure that it avoids an obstacle, then it can slow down and reduce the amount of time that it spends processing video. Once it is past the obstacle, Timbot can reallocate its resources, increasing the quality of the video images that it transmits, and moving faster again."
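The trade-off Jones describes can be sketched as a fixed per-cycle CPU budget shared between obstacle avoidance and video processing: when avoidance gets expensive, video quality is throttled rather than the system failing. The budget figures and function names below are illustrative assumptions, not taken from the Timbot software.

```python
# Hypothetical sketch of graceful degradation under a fixed CPU budget.
# Obstacle avoidance is given what it needs first; whatever remains is
# spent on video processing, so quality degrades instead of the system
# crashing. All numbers are arbitrary illustration units.

CPU_BUDGET = 100  # units of work available per control cycle

def allocate(obstacle_cost):
    """Split the cycle's budget between avoidance and video."""
    obstacle_share = min(obstacle_cost, CPU_BUDGET)  # avoidance has priority
    video_share = CPU_BUDGET - obstacle_share        # video gets the rest
    return obstacle_share, video_share

print(allocate(20))  # clear path: most of the budget goes to video -> (20, 80)
print(allocate(90))  # obstacle ahead: video is throttled          -> (90, 10)
```

Once the obstacle is cleared, `obstacle_cost` drops again and the same rule automatically restores video quality, which is the reallocation behavior the quote describes.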
Timbot is in its early stages of development. While it is capable of some simple navigation by itself, for the most part Jones and his research associates now control it through a wireless link to a laptop computer. But, even at this early stage, the research team is learning about the software challenges of getting Timbot's video, sonar, steering and locomotion systems to work together.
More information about the Timber project, and a QuickTime video of the Timbot, is available at http://www.cse.ogi.edu/~mpj/timbot/index.html.
The above story is reprinted from materials provided by Oregon Health & Science University.