
Need an expert? Try the crowd

Date:
August 14, 2012
Source:
University of Vermont
Summary:
Can a crowd be an expert? Apparently, yes. Scientists have created the first-ever crowd-sourced predictive model.

"It's potentially a new way to do science."

In 1714, the British government held a contest. They offered a large cash prize to anyone who could solve the vexing "longitude problem" -- how to determine a ship's east/west position on the open ocean -- since none of their naval experts had been able to do so.

Lots of people gave it a try. One of them, a self-educated carpenter named John Harrison, invented the marine chronometer, a rugged and highly precise clock that did the trick. For the first time, sailors could accurately determine their location at sea.

A centuries-old problem was solved. And, arguably, crowdsourcing was born.

Crowdsourcing is basically what it sounds like: posing a question or asking for help from a large group of people. Coined as a term in 2006, crowdsourcing has taken off in the internet era. Think of Wikipedia, written by thousands of unpaid contributors and now vastly larger than the Encyclopedia Britannica.

Crowdsourcing has allowed many problems to be solved that would be impossible for experts alone. Astronomers rely on an army of volunteers to scan for new galaxies. At climateprediction.net, citizens have linked their home computers to yield more than a hundred million hours of climate modeling; it's the world's largest forecasting experiment.

But what if experts didn't simply ask the crowd to donate time or answer questions? What if the crowd was asked to decide what questions to ask in the first place?

Could the crowd itself be the expert?

That's what a team at the University of Vermont decided to explore -- and the answer seems to be yes.

Prediction from the people

Josh Bongard and Paul Hines, professors in UVM's College of Engineering and Mathematical Sciences, and their students set out to discover whether volunteers visiting two different websites could pose, refine, and answer one another's questions in a way that effectively predicted the volunteers' body weight and home electricity use.

The experiment, the first of its kind, was a success: the self-directed questions and answers from visitors to the websites led to computer models that effectively predict users' monthly electricity consumption and body mass index.

Their results, "Crowdsourcing Predictors of Behavioral Outcomes," were published in a recent edition of IEEE Transactions on Systems, Man, and Cybernetics, a journal of the Institute of Electrical and Electronics Engineers.

"It's proof of concept that a crowd actually can come up with good questions that lead to good hypotheses," says Bongard, an expert on machine science.

In other words, the UVM project shows that the wisdom of the crowd can be harnessed to determine which variables to study, while the crowd simultaneously provides a pool of data by answering the questions its members pose to one another.

"The result is a crowdsourced predictive model," the Vermont scientists write.

Unexpected angles

Some of the questions the volunteers posed were obvious. For example, on the website dedicated to exploring body weight, visitors came up with the question: "Do you think of yourself as overweight?" And, no surprise, that proved to be the question with the most power to predict people's body weight.

But some questions posed by the volunteers were less obvious. "We had some eye-openers," Bongard says. "How often do you masturbate a month?" might not be the first question asked by weight-loss experts, but it proved to be the second-most-predictive question of the volunteers' self-reported weights, more predictive than "How often do you eat during a day?"
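
How "predictive" a single question is can be scored in several ways. As a rough stand-in (not necessarily the measure used in the study), the sketch below ranks each question by the absolute correlation between its answers and the self-reported outcome, using made-up numbers.

```python
# Illustrative ranking of crowd-posed questions by predictive power,
# using absolute correlation with the outcome as a simple stand-in score.
# The question texts and numbers are made up for this example.

import numpy as np

labels = [
    "Do you think of yourself as overweight?",
    "How often do you eat during a day?",
    "How many hours do you exercise per week?",
]
answers = np.array([
    [1, 4, 1.0],
    [0, 3, 5.0],
    [1, 5, 0.5],
    [0, 2, 3.0],
    [1, 6, 0.0],
    [0, 3, 4.0],
])
outcome = np.array([31.0, 22.5, 29.8, 21.0, 33.2, 23.4])

# Score each question (column) by |Pearson correlation| with the outcome.
scores = {
    label: abs(np.corrcoef(answers[:, j], outcome)[0, 1])
    for j, label in enumerate(labels)
}

# Print questions from most to least predictive under this simple measure.
for label, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:.2f}  {label}")
```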

"Sometimes the general public has intuition about stuff that experts miss -- there's a long literature on this," Hines says.

"It's those people who are very underweight or very overweight who might have an explanation for why they're at these extremes -- and some of those explanations might not be a simple combination of diet and exercise," says Bongard. "There might be other things that experts missed."

Cause and correlation

The researchers are quick to note that the variables revealed by the evolving Q&A on the experimental websites are simply correlated with the outcomes (body weight and electricity use), not necessarily their causes.

"We're not arguing that this study is actually predictive of the causes," says Hines, "but improvements to this method may lead in that direction."

Nor do the scientists claim to be experts on body weight or to be offering recommendations on health or diet (though Hines is an expert on electricity, and the EnergyMinder site he and his students developed for this project has the larger aim of helping citizens understand and reduce their household energy use).

"We're simply investigating the question: could you involve participants in the hypothesis-generation part of the scientific process?" Bongard says. "Our paper is a demonstration of this methodology."

"Going forward, this approach may allow us to involve the public in deciding what it is that is interesting to study," says Hines. "It's potentially a new way to do science."

And there are many reasons why this new approach might be helpful. In addition to factors that experts might simply not know about ("can we elicit unexpected predictors that an expert would not have come up with sitting in his office?" Hines asks), experts often have deeply held biases.

Faster discoveries

But the UVM team primarily sees their new approach as potentially helping to accelerate the process of scientific discovery. The need for expert involvement -- in shaping, say, what questions to ask on a survey or what variable to change to optimize an engineering design -- "can become a bottleneck to new insights," the scientists write.

"We're looking for an experimental platform where, instead of waiting to read a journal article every year about what's been learned about obesity," Bongard says, "a research site could be changing and updating new findings constantly as people add their questions and insights."

The goal: "exponential rises," the UVM scientists write, in the discovery of what causes behaviors and patterns -- probably driven by the people who care about them the most. For example, "it might be smokers or people suffering from various diseases," says Bongard. The team thinks this new approach to science could "mirror the exponential growth found in other online collaborative communities," they write.

"We're all problem-solving animals," says Bongard, "so can we exploit that? Instead of just exploiting the cycles of your computer or your ability to say 'yes' or 'no' on a survey -- can we exploit your creative brain?"


Story Source:

Materials provided by University of Vermont. Original written by Joshua E. Brown. Note: Content may be edited for style and length.


