
Information technology amplifies irrational group behavior

Date:
April 11, 2013
Source:
University of Copenhagen
Summary:
Web tools and social media are our key sources of information when we make decisions as citizens and consumers. But these information technologies can mislead us by magnifying social processes that distort facts and make us act contrary to our own interests. Companies such as Google and Facebook have designed algorithms that are intended to filter away irrelevant information -- known as information selection -- so that we are only served content that fits our clicking history. Researchers say this is a problem from a democratic perspective, as you may never in your online life encounter views or arguments that contradict your worldview.

Web tools and social media are our key sources of information when we make decisions as citizens and consumers. But these information technologies can mislead us by magnifying social processes that distort facts and make us act contrary to our own interests -- such as buying property at wildly inflated prices because we are led to believe that everybody else is. New research from the University of Copenhagen, which has just been published in the journal Metaphilosophy, combines formal philosophy, social psychology, and decision theory to understand and tackle these phenomena.

"Group behaviour that encourages us to make decisions based on false beliefs has always existed. However, with the advent of the internet and social media, this kind of behaviour is more likely to occur than ever, and on a much larger scale, with possibly severe consequences for the democratic institutions underpinning the information societies we live in," says professor of philosophy at the University of Copenhagen Vincent F. Hendricks.

In the article Infostorms, just published in the journal Metaphilosophy, he and fellow researchers Pelle G. Hansen and Rasmus K. Rendsvig analyse a number of social information processes that are amplified by modern information technology.

Informational cascades and Sex and the City

Curiously, an old book entitled Love Letters of Great Men and Women: From the 18th Century to the Present Day, which in 2007 suddenly climbed the Amazon.com bestseller list, provides a good example of group behaviour set in an online context:

"What generated the huge interest in this long forgotten book was a scene in the movie Sex and the City in which the main character Carrie Bradshaw reads a book entitled Love Letters of Great Men -- which does not exist. So, when fans of the movie searched for this book, Amazon's search engine suggested Love Letters of Great Men and Women instead, which made a lot of people buy a book they did not want. Then Amazon's computers started pairing the book with Sex and the City merchandise, and the old book sold in great numbers," Vincent F. Hendricks points out.

"This is known as an 'informational cascade' in which otherwise rational individuals base their decisions not only on their own private information, but also on the actions of those who act before them. The point is that, in an online context, this can take on massive proportions and result in actions that miss their intended purpose."

Online discussions take place in echo chambers

While buying the wrong book does not have serious consequences for our democratic institutions, it exemplifies, according to Professor Vincent F. Hendricks, what may happen when we give our decision-making power to information technologies and processes. And he points to other social phenomena, such as 'group polarization' and 'information selection', which do pose threats to democratic discussion when amplified by online media.

"In group polarization, which is well-documented by social psychologists, an entire group may shift to a more radical viewpoint after a discussion even though the individual group members did not subscribe to this view prior to the discussion. This happens for a number of reasons -- one is that group members want to represent themselves in a favourable light in the group by adopting a viewpoint slightly more extreme than the perceived mean. In online forums, this well-known phenomenon is made even more problematic by the fact that discussions take place in settings where group members are fed only the information that fits their worldview, making the discussion forum an echo chamber where group members only hear their own voices," Vincent F. Hendricks suggests.

Companies such as Google and Facebook have designed algorithms that are intended to filter away irrelevant information -- known as information selection -- so that we are only served content that fits our clicking history. According to Professor Hendricks, this is a problem from a democratic perspective, as you may never in your online life encounter views or arguments that contradict your worldview.
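As a rough illustration of why such filtering closes the loop -- a hypothetical toy ranker, not a description of Google's or Facebook's actual systems, with made-up names and data -- a feed that scores items purely by their overlap with the user's click history keeps surfacing more of the same, while the dissenting item never makes the cut:

```python
from collections import Counter

def personalised_feed(click_history, candidates, top_n=2):
    """Toy 'information selection': score each candidate item only by how
    often its topic tags already appear in the user's click history, so
    content from outside that history can never rank highly."""
    seen = Counter(tag for item in click_history for tag in item["tags"])
    ranked = sorted(candidates,
                    key=lambda item: sum(seen[t] for t in item["tags"]),
                    reverse=True)
    return ranked[:top_n]

history = [{"tags": ["housing", "investment"]},
           {"tags": ["housing"]},
           {"tags": ["investment"]}]

candidates = [
    {"title": "Why prices will keep rising",       "tags": ["housing", "investment"]},
    {"title": "Everyone is buying right now",      "tags": ["housing"]},
    {"title": "The case that prices are inflated", "tags": ["bubble", "risk"]},
]

# The contradicting story scores zero overlap and is filtered out of the feed.
print([item["title"] for item in personalised_feed(history, candidates)])
```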

"If we value democratic discussion and deliberation, we should apply rigorous analysis, from a variety of disciplines, to the workings of these online social information processes as they become increasingly influential in our information societies."


Story Source:

Materials provided by University of Copenhagen. Note: Content may be edited for style and length.


Journal Reference:

  1. Pelle G. Hansen, Vincent F. Hendricks, Rasmus K. Rendsvig. Infostorms. Metaphilosophy, 2013; 44 (3): 301. DOI: 10.1111/meta.12028

