Shape-shifting robots perceive surroundings, make decisions for first time

Date: November 1, 2018
Source: Cornell University
Summary: Researchers have developed modular robots that can perceive their surroundings, make decisions and autonomously assume different shapes in order to perform various tasks -- an accomplishment that brings the vision of adaptive, multipurpose robots a step closer to reality.

General-purpose robots have plenty of limitations. They can be expensive and cumbersome. They often accomplish only a single type of task.

But modular robots -- composed of several interchangeable parts, or modules -- are far more flexible. If one part breaks, it can be removed and replaced. Components can be rearranged as needed -- or better yet, the robots can figure out how to reconfigure themselves, based on the tasks they're assigned and the environments they're navigating.

Now, a Cornell University-led team has developed modular robots that can perceive their surroundings, make decisions and autonomously assume different shapes in order to perform various tasks -- an accomplishment that brings the vision of adaptive, multipurpose robots a step closer to reality.

"This is the first time modular robots have been demonstrated with autonomous reconfiguration and behavior that is perception-driven," said Hadas Kress-Gazit, associate professor of mechanical and aerospace engineering at Cornell and principal investigator on the project.

The results of this research were published in Science Robotics.

The robots are composed of wheeled, cube-shaped modules that can detach and reattach to form new shapes with different capabilities. The modules have magnets to attach to each other, and Wi-Fi to communicate with a centralized system.
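To make that architecture concrete, here is a minimal sketch in Python of how one cube-shaped module might report which of its magnetic faces are attached to the centralized system over its wireless link. The class names, face labels and message format are illustrative assumptions, not the team's actual software.

    # Hypothetical sketch (not the authors' code): a single module reports its
    # connection state to a centralized controller, mirroring the Wi-Fi-linked
    # architecture described above. All names and fields are assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class Module:
        """A wheeled, cube-shaped module with magnetic connectors on its faces."""
        module_id: int
        # True where a magnetic face is currently attached to a neighboring module.
        attached_faces: dict = field(default_factory=lambda: {
            "front": False, "back": False, "left": False, "right": False})

        def status(self) -> dict:
            """The message this module would send to the central system."""
            return {"id": self.module_id, "attached": self.attached_faces}

    class CentralController:
        """Centralized system that collects status messages from every module."""
        def __init__(self):
            self.modules = {}

        def receive(self, message: dict) -> None:
            self.modules[message["id"]] = message["attached"]

    # Two modules report in; the controller now knows the current assembly.
    controller = CentralController()
    for m in (Module(1), Module(2, {"front": True, "back": False,
                                    "left": False, "right": False})):
        controller.receive(m.status())
    print(controller.modules)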

Other modular robot systems have successfully performed specific tasks in controlled environments, but these robots are the first to demonstrate fully autonomous behavior and reconfigurations based on the task and an unfamiliar environment, Kress-Gazit said.

"I want to tell the robot what it should be doing, what its goals are, but not how it should be doing it," she said. "I don't actually prescribe, 'Move to the left, change your shape.' All these decisions are made autonomously by the robot."

The work was funded by the National Science Foundation.


Story Source:

Materials provided by Cornell University. Original written by Melanie Lefkowitz. Note: Content may be edited for style and length.


Journal Reference:

  1. Jonathan Daudelin, Gangyuan Jing, Tarik Tosun, Mark Yim, Hadas Kress-Gazit, Mark Campbell. An integrated system for perception-driven autonomy with modular robots. Science Robotics, 2018; 3 (23): eaat4983. DOI: 10.1126/scirobotics.aat4983

