
Robot Fetches Objects With Just A Point And A Click

Date:
March 20, 2008
Source:
Georgia Institute of Technology
Summary:
Researchers have created a robot, designed to help users with limited mobility with everyday tasks, that moves autonomously to an item selected with a green laser pointer, picks up the item and then delivers it to the user, another person or a selected location such as a table. The new robotic communication method may help robots find their way into the home sooner.

Robots are fluent in their native language of absolute 1s and 0s but struggle to grasp the nuances and imprecision of human language. While scientists are making slow, incremental progress in their quest to create a robot that responds to speech, gestures and body language, a more straightforward method of communication may help robots find their way into homes sooner.

A team of researchers led by Charlie Kemp, director of the Center for Healthcare Robotics in the Health Systems Institute at the Georgia Institute of Technology and Emory University, has found a way to instruct a robot to find and deliver an item it may have never seen before using a more direct manner of communication: a laser pointer.

El-E (pronounced like the name Ellie), a robot designed to help users with limited mobility with everyday tasks, autonomously moves to an item selected with a green laser pointer, picks up the item and then delivers it to the user, another person or a selected location such as a table. El-E, named for her ability to elevate her arm and for the arm’s resemblance to an elephant trunk, can grasp and deliver several types of household items including towels, pill bottles and telephones from floors or tables.

To ensure that El-E will someday be ready to roll out of the lab and into the homes of patients who need assistance, the Georgia Tech and Emory research team includes Prof. Julie Jacko, an expert on human-computer interaction and assistive technologies, and Dr. Jonathan Glass, director of the Emory ALS Center at the Emory University School of Medicine. El-E’s creators are gathering input from ALS (also known as Lou Gehrig’s disease) patients and doctors to prepare El-E to assist patients with severe mobility challenges.

The research was presented at the ACM/IEEE International Conference on Human-Robot Interaction in Amsterdam on March 14 and at an associated workshop on “Robotic Helpers” on March 12.

Charlie Kemp, director of the Center for Healthcare Robotics at Georgia Tech and Emory University, accepts a towel from El-E, a robot designed to aid users with mobility impairments with everyday tasks.

The verbal instructions a person gives to help someone find a desired object are very difficult for a robot to use (“the cup over near the couch” or “the brush next to the red toothbrush”). These types of commands require the robot to understand everyday human language and the objects it describes at a level well beyond the state of the art in language understanding and object perception.

“We humans naturally point at things but we aren’t very accurate, so we use the context of the situation or verbal cues to clarify which object is important,” said Kemp, an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory. “Robots have some ability to retrieve specific, predefined objects, such as a soda can, but retrieving generic everyday objects has been a challenge for robots.”

The laser pointer interface and methods developed by Kemp’s team overcome this challenge by providing a direct way for people to communicate the location of interest to El-E, along with complementary methods that enable El-E to pick up an object found at that location. Through these innovations, El-E can retrieve objects without understanding what the object is or what it’s called.

In addition to the laser pointer interface, El-E uses another approach to simplify its task. Indoors, objects are usually found on smooth, flat surfaces with uniform appearance, such as floors, tables, and shelves. Kemp’s team designed El-E to take advantage of this common structure.

Regardless of an object’s height, El-E uses the same strategies to localize and pick it up, elevating its arm and sensors to match the height of the object’s location. The robot’s ability to reach objects both on the floor and on shelves is particularly important for patients with mobility impairments, since these locations can be difficult to reach, Kemp said.

El-E uses a custom-built omnidirectional camera to see most of the room. Once it detects that a selection has been made with the laser pointer, the robot points two cameras at the laser spot and triangulates its position in three-dimensional space.
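The article does not spell out El-E’s triangulation math, but a standard way to recover a 3-D point from two calibrated camera views is linear (DLT) triangulation. The Python sketch below illustrates that general idea; the function name, camera parameters and example laser-spot position are invented for illustration and are not taken from El-E’s software.

import numpy as np

def triangulate_laser_spot(P1, P2, uv1, uv2):
    """Estimate the 3-D position of a laser spot seen by two calibrated
    cameras, using linear (DLT) triangulation.

    P1, P2 : 3x4 camera projection matrices (assumed known from calibration).
    uv1, uv2 : pixel coordinates (u, v) of the detected laser spot in each view.
    Returns the 3-D point in the shared world frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A X = 0 in the least-squares sense with the SVD.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

if __name__ == "__main__":
    # Two hypothetical cameras: one at the origin, one offset 0.3 m along x.
    K = np.array([[500.0, 0.0, 320.0],
                  [0.0, 500.0, 240.0],
                  [0.0, 0.0, 1.0]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-0.3], [0.0], [0.0]])])

    true_point = np.array([0.5, 0.2, 2.0])  # a laser spot about 2 m away
    project = lambda P, X: (P @ np.append(X, 1.0))[:2] / (P @ np.append(X, 1.0))[2]
    est = triangulate_laser_spot(P1, P2, project(P1, true_point), project(P2, true_point))
    print(est)  # ~ [0.5, 0.2, 2.0]

In a real system the projection matrices would come from a prior camera calibration, and the pixel coordinates from detecting the bright green laser spot in each image.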

Next, the robot estimates where the item is in relation to its body and travels to the location. If the location is above the floor, the robot finds the edge of the surface on which the object is sitting, such as the edge of a table.

Picking up the unknown object is a significant challenge El-E faces in completing its task. It uses a laser range finder that scans across the surface to initially locate the object. Then, after moving its hand above the object, it uses a camera in its hand to visually distinguish the object from the texture of the floor or table. After refining the hand’s position and orientation, it descends upon the object while using sensors in its hand to decide when to stop moving down and start closing its gripper. Finally, it closes its gripper upon the object until it has a secure grip.
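Taken together, that pick-up sequence is essentially a small state machine: coarse localization with the range finder, visual refinement with the hand camera, a guarded descent, then closing the gripper. The sketch below mirrors those steps in Python with an entirely hypothetical robot interface (LaserScanner, HandCamera, Arm, Gripper); none of the names, thresholds or sensor models come from El-E’s actual code.

import random

class LaserScanner:
    def locate_object(self):
        """Scan across the surface and return a rough (x, y) object position in metres."""
        return (0.40 + random.uniform(-0.01, 0.01),
                0.10 + random.uniform(-0.01, 0.01))

class HandCamera:
    def refine(self, rough_xy):
        """Visually separate the object from the surface and refine the hand's
        target position and orientation (x, y, yaw)."""
        x, y = rough_xy
        return x, y, 0.15  # small corrective yaw, radians

class Arm:
    def __init__(self):
        self.z = 0.30  # hand height above the surface, metres
    def move_above(self, x, y, yaw):
        print(f"hand above object at ({x:.2f}, {y:.2f}), yaw {yaw:.2f}")
    def descend_step(self, dz=0.01):
        self.z -= dz
    def contact_sensed(self):
        """Stand-in for the hand sensors that signal when to stop descending."""
        return self.z <= 0.05

class Gripper:
    def close_until_secure(self):
        print("closing gripper until the grip is secure")

def pick_up():
    rough = LaserScanner().locate_object()   # 1. coarse localization on the surface
    x, y, yaw = HandCamera().refine(rough)   # 2. visual refinement from the hand camera
    arm = Arm()
    arm.move_above(x, y, yaw)                # 3. position the hand over the object
    while not arm.contact_sensed():          # 4. guarded descent toward the object
        arm.descend_step()
    Gripper().close_until_secure()           # 5. close the gripper on the object

if __name__ == "__main__":
    pick_up()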

Once the robot has picked up the item, the laser pointer can be used to guide the robot to another location to deposit the item or direct the robot to take the item to a person. El-E distinguishes between these two situations by looking for a face near the selected location.

If the robot detects a face, it carefully moves toward the person and presents the item to the user so it can be taken. It uses the location of the face and legs to determine where it will present the object.

If no face is detected near the location illuminated by the laser pointer, the robot decides whether the location is on a table or the floor. If it is on a table, El-E places the object on the table. If the location is on the floor, El-E moves to the selected spot on the floor.
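The delivery behavior described in the last three paragraphs reduces to a simple decision rule: a detected face wins, otherwise the height of the selected location decides between table and floor. A minimal Python illustration follows; the function name and the table-height cut-off are assumptions for the sketch, not values from the El-E system.

TABLE_HEIGHT_THRESHOLD = 0.30  # metres above the floor (assumed cut-off)

def choose_delivery(face_detected: bool, target_height: float) -> str:
    """Decide what to do with a held object once the user points again."""
    if face_detected:
        return "approach the person and present the object"
    if target_height > TABLE_HEIGHT_THRESHOLD:
        return "place the object on the table at the selected spot"
    return "drive to the selected spot on the floor"

if __name__ == "__main__":
    print(choose_delivery(face_detected=True, target_height=0.0))
    print(choose_delivery(face_detected=False, target_height=0.75))
    print(choose_delivery(face_detected=False, target_height=0.0))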

After delivering the item, the robot returns to the user’s side, ready to handle the next request.

El-E’s power and computation are entirely on board (no tethers or hidden computers in the next room); the robot runs Ubuntu Linux on a Mac mini.

El-E’s laser pointer interface and methods for autonomous mobile manipulation represent an important step toward robotic assistants in the home.

“If you want a robot to cook a meal or brush your hair, you will probably want the robot to first fetch the items it will need, and for tasks such as cleaning up around the home, it is essential that the robot be able to pick up objects and move them to new locations. We see object fetching as a core capability for future robots in healthcare settings, such as the home,” Kemp said.

The Georgia Tech and Emory research team is now working to expand El-E’s capabilities to include switching lights on and off when the user selects a light switch and opening and closing doors when the user selects a door knob.


Story Source:

Materials provided by Georgia Institute of Technology. Note: Content may be edited for style and length.


