Data publishers should have the option of submitting their biodiversity datasets for peer review, according to a discussion paper commissioned by GBIF.
The proposal is among a set of recommendations made by Mark Costello and co-authors in the paper "Quality assurance and Intellectual Property Rights in advancing biodiversity data publication," freely available for download through the GBIF Online Resource Centre.
The paper argues that concerns over data quality impede researchers' use of large biodiversity databases, and with it the benefits to society that such use could bring. Peer review is proposed as the highest level on a scale of quality assurance that could be attached to datasets published through online networks such as GBIF (http://www.gbif.org/orc/?doc_id=5016).
"Peer review is the standard mechanism used to distinguish the quality of scientific publications. Here, we argue that the next step in data publication is to include the option of peer review," the authors write.
"Data publication can be similar to the conventional publication of articles in journals that includes online submission, quality checks, peer-review, and editorial decisions.
"This quality-assurance process will at least assess, and potentially could improve the accuracy of the data, which in turn reduces the need for users to 'clean' the data, and thus increases data use while the authors and/or editors get due credit for a peer-reviewed (data) publication."
The proposal complements existing procedures that enable authors of metadata documents describing datasets published through GBIF to submit 'data papers' to online journals. Experience has shown that referees of these papers in practice review the data as well as the metadata, adding a layer of quality control.
The paper also recommends adoption of community-wide standards related to data citation, accessibility, metadata, and quality control to enable easier integration of data across datasets.