
Outcome data in clinical trials reported inadequately, inconsistently, study finds

Date:
May 14, 2014
Source:
Elsevier Health Sciences
Summary:
There is increasing public pressure to report the results of all clinical trials to eliminate publication bias and improve public access. However, investigators building a database of clinical trials involving chronic pain have encountered several challenges. They describe the perils in a new article, and propose alternative strategies to improve clinical trials reporting.

There is increasing public pressure to report the results of all clinical trials to eliminate publication bias and improve public access. However, investigators using the World Health Organization's International Clinical Trials Registry Platform (ICTRP) to build a database of clinical trials involving chronic pain have encountered several challenges. They describe the perils and pitfalls of using the ICTRP and propose alternative strategies to improve clinical trials reporting. This important and insightful study is published in the August issue of the journal PAIN®.

U.S. law already requires posting summarized results on ClinicalTrials.gov, a service of the National Institutes of Health, within one year of study completion for certain categories of industry-sponsored trials. Legislation mandating data publication within one year of study completion, irrespective of outcome, is under consideration in the European Union. Yet compliance with the U.S. law is poor.

"Although clinical trial registries facilitate public access to basic trial information, we found that access to unbiased trial results is still inadequate. A distressingly large number of trials have no published results at all or are mentioned only in sponsor press releases. Recent analyses have found that only 25-35% of clinical trials required to post study results on ClinicalTrials.gov actually do so," comments senior investigator Michael C. Rowbotham, MD, scientific director of the California Pacific Medical Center Research Institute in San Francisco.

The investigators drew on their experience with the Repository of Registered Analgesic Clinical Trials (RReACT) database, a scorecard for analgesic clinical trials for chronic pain (sponsored by an FDA grant to the University of Rochester), to describe the challenge of constructing a global open-access database of clinical trials and trial results. They focused on three frequently studied chronic pain disorders: post-herpetic neuralgia, fibromyalgia, and painful diabetic peripheral neuropathy. The initial build of RReACT was limited to randomized trials registered on ClinicalTrials.gov with a primary (or key secondary) outcome measure assessing analgesic drug efficacy. The database was then expanded to report on all of the primary registries in the ICTRP, and investigators analyzed trial registration, registry functionality, and cross-registry harmonization, using a comprehensive search algorithm to find trial results in the peer-reviewed literature and grey literature. A total of 447 unique trials were identified, with 86 trials listed on more than one registry.
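
The cross-registry harmonization step described above comes down to recognizing when records from different registries refer to the same trial. As a rough illustration only (the record fields and the matching rule are assumptions, not the investigators' actual RReACT pipeline), a de-duplication pass might look something like this:

```python
# A minimal sketch of cross-registry de-duplication, loosely modeled on the
# harmonization step described above. The record fields and the matching rule
# (merge on any shared identifier) are illustrative assumptions, not the
# investigators' actual RReACT pipeline.

def deduplicate_trials(records):
    """Group registry records that appear to describe the same trial.

    Each record is a dict with a 'registry_id' (e.g. an NCT or EudraCT number)
    and an optional 'secondary_ids' list of identifiers from other registries.
    Records sharing any identifier are collapsed into one group.
    """
    groups = []  # each group: {"ids": set of identifiers, "records": list}
    for rec in records:
        ids = {rec["registry_id"], *rec.get("secondary_ids", [])}
        matching = [g for g in groups if g["ids"] & ids]
        merged = {"ids": set(ids), "records": [rec]}
        for g in matching:          # merge every group this record bridges
            merged["ids"] |= g["ids"]
            merged["records"] += g["records"]
            groups.remove(g)
        groups.append(merged)
    return groups


if __name__ == "__main__":
    sample = [
        {"registry_id": "NCT00000001", "secondary_ids": ["2010-000001-01"]},
        {"registry_id": "2010-000001-01"},   # same trial, listed on EU-CTR
        {"registry_id": "NCT00000002"},
    ]
    unique = deduplicate_trials(sample)
    print(f"{len(unique)} unique trials from {len(sample)} registry records")
```

Grouping on any shared identifier is deliberately permissive; in practice the investigators also had to compare trial details such as outcome measure descriptions, which they found can differ between registries for the same trial.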

The ICTRP provides a single search portal to 15 primary registries, including ClinicalTrials.gov. ICTRP primary registries follow International Committee of Medical Journal Editors (ICMJE) guidelines and must have a national or regional focus, government support, nonprofit management, free public access, and an unambiguous trial identification method.

ClinicalTrials.gov is the largest ICTRP database, with more than 152,000 trials globally. The EU Clinical Trials Register (EU-CTR) is the second largest, with more than 21,000 trials. Current Controlled Trials (more than 11,000 trials), the oldest global registry, is hosted by BioMed Central (part of Springer Science + Business Media, a for-profit scientific publisher specializing in open-access journals). Five national registries each contain fewer than 1,000 trials. All ICTRP registries provide information about study design (i.e., randomization, blinding, control groups, inclusion/exclusion criteria, and outcome measures) and current study status. Not all ICTRP registries track study changes, list additional study identifiers, or provide links to publications.

"We identified several perils and pitfalls of using the ICTRP," says Dr. Rowbotham. "Manual searches are necessary, as ICTRP does not reliably identify trials listed on multiple registries. Searching ICTRP as a whole yields different results from searching registries individually. Outcome measure descriptions for multiply-registered trials vary between registries. Registry-publication pairings are often inaccurate or incomplete. Ideally, a PubMed search on the trial registration number would reveal all study-related articles, but a recent analysis showed that about 40% of journal publications failed to include registration numbers. And grey literature results--such as trial-specific press releases or company statements, information found on the websites of pharmaceutical companies, and abstracts of poster/platform presentations at scientific meetings--are not permanent.

"Creating a single global registry would solve many of the problems we describe here," he continues. "However, international politics and funding limitations suggest this is a challenging goal. Despite its flaws, ICTRP does at least offer a single search portal."

The investigators offer several suggestions for improving the current situation. In addition to the simple remedy of including trial registration numbers on all meeting abstracts and peer-reviewed papers, they propose specific strategies to identify multiply-registered studies and to ensure accurate pairing of results with publications.

"Compliance might improve, especially for difficult-to-publish 'negative' studies, if posting results on trial registries could be made simpler and uniform. Alternative solutions to the problems of publication bias and selective reporting should also be explored. These might involve including journals specializing in publishing 'negative' results, creating user-friendly and publicly available databases to publish results, and raising the awareness of authors, reviewers, and editors about these issues," concludes Dr. Rowbotham.


Story Source:

Materials provided by Elsevier Health Sciences. Note: Content may be edited for style and length.


Journal Reference:

  1. John T. Farrar, Andrea B. Troxel, Kevin Haynes, Ian Gilron, Robert D. Kerns, Nathaniel P. Katz, Bob A. Rappaport, Michael C. Rowbotham, Ann M. Tierney, Dennis C. Turk, Robert H. Dworkin. The Effect of Variability in the 7-day Baseline Pain Diary on the Assay Sensitivity of Neuropathic Pain Randomized Clinical Trials: An ACTTION Study. PAIN®, 2014; DOI: 10.1016/j.pain.2014.05.009

