A new computer vision system for automated analysis of animal movement — honey bee activities, in particular — is expected to accelerate animal behavior research, which also has implications for biologically inspired design of robots and computers.
The animal movement analysis system is part of the BioTracking Project, an effort conducted by Georgia Institute of Technology robotics researchers led by Tucker Balch, an assistant professor of computing.
"We believe the language of behavior is common between robots and animals," Balch said. "That means, potentially, that we could videotape ants for a long period of time, learn their 'program' and run it on a robot."
Social insects, such as ants and bees, demonstrate that successful large-scale, robust behavior can be forged from the interaction of many simple individuals, Balch explained. Such behavior can offer ideas on how to organize a cooperating colony of robots capable of complex operations.
To expedite the understanding of such behavior, Balch's team developed a computer vision system that automates analysis of animal movement — once an arduous and time-consuming task. Researchers are using the system to analyze data on the sequential movements that encode information — for example in bees, the location of distant food sources, Balch said. He will present the research at the Second International Workshop on the Mathematics and Algorithms of Social Insects on Dec. 16-17 at Georgia Tech.
With an 81.5 percent accuracy rate, the system can automatically analyze bee movements and label them based on examples provided by human experts. This level of labeling accuracy is high enough to allow researchers to build a subsequent system to accurately determine the behavior of a bee from its sequence of motions, Balch explained.
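Labeling motions from expert-provided examples can be sketched in miniature as a nearest-neighbor comparison: describe each short motion segment with a few numbers and assign it the label of the most similar expert-labeled example. The features, example values and one-nearest-neighbor rule below are illustrative assumptions, not the BioTracking team's actual classifier.

```python
import math

# Hypothetical expert-labeled examples:
# (mean speed in mm/s, mean turn rate in deg/frame) -> motion label.
LABELED_EXAMPLES = [
    ((12.0,  1.0), "waggle"),
    ((8.0,  15.0), "arc_right"),
    ((8.0, -15.0), "arc_left"),
    ((0.5,   0.2), "stationary"),
]

def label_segment(features):
    """Assign a segment the label of the nearest expert-labeled example."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    nearest = min(LABELED_EXAMPLES, key=lambda ex: dist(ex[0], features))
    return nearest[1]

# A fast, nearly straight segment lands closest to the "waggle" example.
label = label_segment((11.0, 2.0))
```

In practice a system like this would be scored against held-out hand labels, which is how an accuracy figure such as 81.5 percent would be measured.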
For example, one sequence of motions bees commonly perform is the waggle dance, which consists of arcing to the right, waggling (walking in a generally straight line while oscillating left and right), arcing to the left, waggling and so on. These motions encode the locations of distant food sources, according to Cornell University Professor of Biology Thomas Seeley, who has collaborated with Balch on this project. Balch is also working with Professor Deborah Gordon of Stanford University on related work with ants.
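The dance's location code can be illustrated with a toy decoder based on the classic interpretation attributed to Karl von Frisch: the waggle run's angle from vertical gives the food's bearing relative to the sun, and the run's duration grows with distance. The roughly one-kilometer-per-second calibration below is an often-quoted ballpark figure used here purely for illustration; real calibrations vary by bee race and conditions.

```python
def decode_waggle(angle_from_vertical_deg, waggle_duration_s,
                  sun_azimuth_deg, metres_per_second=1000.0):
    """Toy waggle-dance decoder.

    Returns (compass bearing in degrees, distance in metres) under the
    assumed calibration that one second of waggling ~ one kilometre.
    """
    bearing = (sun_azimuth_deg + angle_from_vertical_deg) % 360.0
    distance = waggle_duration_s * metres_per_second
    return bearing, distance

# A waggle run 30 degrees right of vertical, lasting 1.5 s, with the
# sun due south (azimuth 180 degrees):
bearing, distance = decode_waggle(30.0, 1.5, sun_azimuth_deg=180.0)
```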
Balch's animal movement analysis system has several components. First, researchers shoot 15 minutes of videotape of bees, some of which are marked with bright-colored paint and returned to an observation hive. Then computer vision-based tracking software converts the video of the marked bees into x- and y-coordinate location information for each animal in each frame of the footage. Some segments of this data are hand-labeled by a researcher and then used as motion examples for the automated analysis system.
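The step between tracking and labeling can be sketched as turning a bee's per-frame (x, y) positions into simple motion descriptors. The particular features below, per-step speed and heading change, are illustrative assumptions rather than the system's documented feature set.

```python
import math

def motion_features(track):
    """Convert a per-frame track of (x, y) positions into a list of
    (speed, turn) pairs: distance moved per step and heading change in
    degrees between consecutive steps (first turn defined as 0)."""
    speeds, headings = [], []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        dx, dy = x1 - x0, y1 - y0
        speeds.append(math.hypot(dx, dy))
        headings.append(math.atan2(dy, dx))
    # Wrap heading differences into (-180, 180] degrees.
    turns = [0.0] + [
        math.degrees((b - a + math.pi) % (2 * math.pi) - math.pi)
        for a, b in zip(headings, headings[1:])
    ]
    return list(zip(speeds, turns))

# A bee walking in a straight line: constant speed, no turning.
feats = motion_features([(0, 0), (1, 0), (2, 0), (3, 0)])
```

Segments of such feature sequences are the natural input for the example-based labeling described above.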
In future work, Balch and his colleagues will build a system that can learn executable models of these behaviors and then run the models in simulation. These simulations, Balch explained, would reveal the accuracy of the models. Researchers don't yet know if these models will yield better computer programming algorithms, though they are hopeful based on what previous research has revealed.
"Computer scientists have applied some of the algorithms discovered by biologists working with insects to challenging problems in computing," Balch said. "One example is network routing, which dictates the path data takes across the Internet. In this case the insect-based network routing algorithm, investigated by Marco Dorigo, is the best solution to date."
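The pheromone idea behind ant-inspired routing, the family of algorithms to which Dorigo's work belongs, can be sketched briefly: each node keeps a pheromone weight per neighbor, chooses next hops with probability proportional to those weights, and reinforces links that lie on successful routes while all links slowly evaporate. This is a minimal illustration of the principle, not Dorigo's actual algorithm.

```python
import random

def choose_next_hop(pheromone, rng=random.random):
    """Pick a neighbour with probability proportional to its pheromone."""
    total = sum(pheromone.values())
    r = rng() * total
    for node, weight in sorted(pheromone.items()):
        r -= weight
        if r <= 0:
            return node
    return node  # numerical fallback

def reinforce(pheromone, used_node, reward=1.0, evaporation=0.1):
    """Evaporate every link slightly, then strengthen the one just used."""
    for n in pheromone:
        pheromone[n] *= (1.0 - evaporation)
    pheromone[used_node] += reward

# Two equally attractive neighbours; a packet succeeds via B, so B's
# pheromone rises while C's evaporates, biasing future choices toward B.
table = {"B": 1.0, "C": 1.0}
reinforce(table, "B")
```

Over many such updates, good paths accumulate pheromone and attract more traffic, which is the self-organizing effect the insect analogy captures.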
But challenges lie ahead for researchers. They will have to grapple with differences between the motor and sensory capabilities of robots and insects, Balch added.
Balch's research team members include graduate student Adam Feldman, Assistant Professor of Computing Frank Dellaert and researcher Zia Khan. More information about their project is available at borg.cc.gatech.edu/biotracking. The project is funded by a grant from the National Science Foundation.
In related research with Professor Kim Wallen at Emory University's Yerkes National Primate Research Center, Balch and Khan are also observing monkeys with a similar computer vision system. They hope these studies will yield behavior models that can be implemented in computer code.
The research team is learning about monkeys' spatial memory and social interactions. Already, they can track the movements of individual monkeys as they search for and find hidden treats in a large enclosure. Later, they want to observe a troop of 60 to 80 monkeys living together in a larger compound.
So far, researchers have learned that male and female monkeys have different spatial memories. Males apparently remember the physical distance to food, while females follow landmarks to find treats, Balch says.
"We're involved to measure precisely where the monkeys go and how long it takes them to find the food," Balch explains. "We use the information from experiments to test hypotheses on spatial memory. We're more interested in the social systems among these animals. But we need this basic capability to track monkeys in 3D. So this work is a first step in this direction."
Ultimately, Balch and his colleagues in the Georgia Tech College of Computing's "Borg Lab" — named after the Borg of "Star Trek" fame — want to use this animal behavior information to design robots that work effectively with people in dynamic, noisy and unknown environments such as those faced by military and law enforcement officials.
Balch will present his research team's findings on the bee movement system at the December workshop to prominent biologists, mathematicians, engineers and computer scientists who gather to share ideas about mathematical and algorithmic models of social insect behavior. Balch organized the workshop with Carl Anderson, a visiting assistant professor of natural systems in the Georgia Tech School of Industrial and Systems Engineering. For more information on the workshop, see http://www.insects.gatech.edu.