
Scientists 'bad at judging peers' published work,' says new study

Date:
October 8, 2013
Source:
Public Library of Science
Summary:
Are scientists any good at judging the importance of the scientific work of others? According to a new study, scientists are unreliable judges of the importance of fellow researchers' published papers.

Are scientists any good at judging the importance of the scientific work of others? According to a study published 8 October in the open access journal PLOS Biology (with an accompanying editorial), scientists are unreliable judges of the importance of fellow researchers' published papers.

The article's lead author, Professor Adam Eyre-Walker of the University of Sussex, says: "Scientists are probably the best judges of science, but they are pretty bad at it."

Prof. Eyre-Walker and Dr Nina Stoletzki studied three methods of assessing published scientific papers, using two sets of peer-reviewed articles. The three assessment methods the researchers looked at were:

  • Peer review: subjective post-publication peer review where other scientists give their opinion of a published work;
  • Number of citations: the number of times a paper is referenced as a recognised source of information in another publication;
  • Impact factor: a measure of a journal's importance, determined by the average number of times papers in a journal are cited by other scientific papers.
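
For concreteness, the conventional two-year impact factor divides the citations a journal's papers receive in a given year by the number of citable items the journal published in the preceding two years. The short Python sketch below illustrates that arithmetic; the function name and the figures are illustrative only and do not come from the study.

    def impact_factor(citations_received, items_published):
        """Conventional two-year journal impact factor.

        citations_received: citations in the census year to papers the
            journal published in the two preceding years.
        items_published: citable items the journal published in those
            two years.
        """
        return citations_received / items_published

    # Made-up figures: 2,450 citations in 2013 to papers published in
    # 2011-2012, from 480 citable items published in that window.
    print(round(impact_factor(2450, 480), 1))  # 5.1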

The findings, say the authors, show that scientists are unreliable judges of the importance of a scientific publication: they rarely agree on the importance of a particular paper and are strongly influenced by where the paper is published, over-rating science published in high-profile scientific journals. Furthermore, the authors show that the number of times a paper is subsequently referred to by other scientists bears little relation to the underlying merit of the science.

As Eyre-Walker puts it: "The three measures of scientific merit considered here are poor; in particular subjective assessments are an error-prone, biased and expensive method by which to assess merit. While the impact factor may be the most satisfactory of the methods considered, since it is a form of prepublication review, it is likely to be a poor measure of merit, since it depends on subjective assessment."

The authors argue that the study's findings could have major implications for any future assessment of scientific output, such as that currently being carried out for the UK Government's forthcoming Research Excellence Framework (REF). Eyre-Walker adds: "The quality of the assessments generated during the REF is likely to be very poor, and calls into question whether the REF in its current format is a suitable method to assess scientific output."

PLOS Biology is also publishing an accompanying Editorial by Dr Jonathan Eisen of the University of California, Davis, and Drs Catriona MacCallum and Cameron Neylon from the Advocacy department of the open access organization the Public Library of Science (PLOS).

These authors welcome Eyre-Walker and Stoletzki's study as being "among the first to provide a quantitative assessment of the reliability of evaluating research," and encourage scientists and others to read it. They also support the study's call for openness in research assessment processes. However, they caution that assessment of merit is intrinsically a complex and subjective process, with "merit" itself meaning different things to different people, and point out that Eyre-Walker and Stoletzki's study "purposely avoids defining what merit is."

Dr Eisen and co-authors also tackle the suggestion that the impact factor is the "least bad" form of assessment, recommending the use of multiple metrics that appraise the article rather than the journal ("a suite of article level metrics"), an approach that PLOS has been pioneering. Such metrics might include "number of views, researcher bookmarking, social media discussions, mentions in the popular press, or the actual outcomes of the work (e.g. for practice and policy)."
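
As a rough illustration of what such a suite might look like in practice, the sketch below gathers several per-article signals of the kind the editorial mentions into a single record. The field names and numbers are hypothetical and do not represent any actual PLOS metrics service.

    from dataclasses import dataclass, asdict

    @dataclass
    class ArticleMetrics:
        # Hypothetical article-level signals; none of these field names
        # come from PLOS itself.
        doi: str
        views: int = 0            # number of views
        bookmarks: int = 0        # researcher bookmarking
        social_mentions: int = 0  # social media discussions
        press_mentions: int = 0   # mentions in the popular press
        citations: int = 0        # citations in later papers

    # Example record with made-up numbers for the paper discussed above.
    metrics = ArticleMetrics(doi="10.1371/journal.pbio.1001675",
                             views=12000, bookmarks=85,
                             social_mentions=240, press_mentions=6,
                             citations=30)
    print(asdict(metrics))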


Story Source:

The above story is based on materials provided by Public Library of Science. Note: Materials may be edited for content and length.


Journal Reference:

  1. Adam Eyre-Walker, Nina Stoletzki. The Assessment of Science: The Relative Merits of Post-Publication Review, the Impact Factor, and the Number of Citations. PLoS Biology, 2013; 11(10): e1001675. DOI: 10.1371/journal.pbio.1001675

