
Fundamental Flaws In Many Medical Studies Must Be Fixed In Order To Catch Signs Of Harm

Date:
November 11, 2005
Source:
University of Michigan Health System
Summary:
The arthritis drug Vioxx eased the pain of millions of patients -- but also increased heart attack and stroke risk among some of them. Such problems may often go unnoticed at first, but might be prevented if medical researchers changed the way they evaluate new medicines, medical devices and other treatments, according to a new study.

The arthritis drug Vioxx eased the pain of millions of patients -- but it also greatly increased the risk of heart attack and stroke among some of them. And that extra risk only came to light through after-the-fact analysis of data from Vioxx research studies.

Such medical safety problems may often go unnoticed at first, but many could be prevented if medical researchers changed the way they evaluate new medicines, medical devices and other treatments, according to a new paper in the journal Health Affairs. Such changes may have to be mandated, say the authors from the Veterans Affairs Ann Arbor Healthcare System and the University of Michigan Health System.

The paper's authors say there are fundamental flaws in the way researchers usually analyze and report the results of medical studies, especially randomized clinical trials that are seen as the "gold standard" method for studying the effectiveness and safety of new treatments. Those flaws make it more likely that they'll miss signs of extra harm -- or extra benefit -- to subgroups of patients who have certain characteristics.

Only by analyzing results using a statistical method that considers several factors at once -- for instance, overall heart attack risk, rather than just individual risk factors such as age -- will researchers truly be able to see how the risks and benefits of a treatment vary from patient to patient, the study suggests.

This kind of analysis, called multivariable or risk-stratified analysis, would require more statistical know-how. But it could save lives and money, and allow patients to weigh a treatment's true risks and benefits to their health before starting treatment.
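As a rough illustration (not taken from the paper), a risk-stratified analysis of a simulated trial might look like the following Python sketch. The data, effect sizes and column names are invented, the treatment is assumed to cut event risk by 25 percent while adding a small flat harm, and the NumPy and pandas libraries are required:

    # A minimal sketch of a risk-stratified trial analysis.
    # Data, effect sizes and column names are invented for illustration only.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    n = 4000
    baseline_risk = rng.beta(2, 20, size=n)   # each patient's predicted risk of the event
    treated = rng.integers(0, 2, size=n)      # 1 = treatment arm, 0 = control
    # Assume treatment cuts event risk by 25 percent but adds a flat 0.3 percent chance of harm.
    event_prob = baseline_risk * np.where(treated == 1, 0.75, 1.0) + treated * 0.003
    event = rng.random(n) < event_prob

    df = pd.DataFrame({"risk": baseline_risk, "treated": treated, "event": event})
    df["risk_group"] = pd.qcut(df["risk"], 4, labels=["Q1 (low)", "Q2", "Q3", "Q4 (high)"])

    # Absolute benefit (control event rate minus treated event rate) within each risk quartile:
    rates = df.groupby(["risk_group", "treated"], observed=True)["event"].mean().unstack()
    print((rates[0] - rates[1]).rename("absolute benefit per patient"))

In a breakdown like this, the highest-risk quarter of patients shows a much larger absolute benefit than the lowest-risk quarter even though the relative effect is the same for everyone -- the kind of pattern the authors argue routinely goes unreported.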

"Most studies currently emphasize the average risk and average benefit found in the study, but the average trial participant might get much less benefit than average, or even be harmed," says lead author Rodney Hayward, M.D. "If 9 people are in a room with Bill Gates, the average net worth of people in the room will be several billion dollars even if everyone else in the room is in serious debt.

"Similarly, in a clinical trial, if a small group of patients gets a very large amount of benefit, the treatment can have average benefit even when most people in the trial received no benefit and some are harmed," he says. "We believe our study clearly demonstrates that the current way medical treatments are evaluated is inadequate to detect this problem, that this problem is not rare, and that mandating risk-stratified analysis has the potential to decrease the risk of promoting wasteful and unsafe treatments." Hayward directs the VA Center for Practice Management and Outcomes Research and is a professor of medicine and public health at U-M.

In the paper, he and his colleagues present real data from clinical studies as well as hypothetical simulation models. They also report what they found in a review of papers in major medical journals, which showed that only 4 percent of clinical trials used risk-stratified analysis.

Among the treatments that should have been studied with risk-stratified analysis, they show, is the clot-busting drug tPA. A major 1993 study showed that heart attack patients who received it were much less likely to die than those who got another drug.

But when Hayward's colleague David M. Kent, M.D., M.Sc., now at Tufts University, analyzed the data from this study in a risk-stratified way, he found major differences in effectiveness of tPA. In fact, his analysis shows that 25 percent of the patients in the original study accounted for more than 60 percent of all the benefit in the entire study. Meanwhile, half the patients received little or no benefit -- and some had such a high risk of brain bleeding from tPA that there was net harm.

"A risk-stratified analysis can also find that we are underestimating the benefits of a treatment," says Hayward. For example, a major study of tPA for stroke treatment showed no benefit among any group of patients treated between three and five hours after the start of their stroke. But Kent and others showed by risk-stratified analysis that many patients who had a low risk of brain bleeding did indeed receive significant benefit from tPA even three to five hours after a stroke.

These real-world examples are among those described in the new paper alongside statistical simulations that further illustrate how "average" trial results can be misleading.

"A treatment that decreases risk of stroke by 25 percent will have 100 times more absolute benefit for someone with a one in 10 risk of having a stroke than for someone with a one in 1,000 risk," notes Timothy Hofer, M.D., M.Sc., a U-M associate professor and VA research scientist who worked on the statistical simulation models for the study. "In fact, if that treatment also harms even one in 1,000 people due to serious medication-related adverse events, giving that treatment to the low-risk person you will harm four people for every one person it helps."

"Risk stratification of study results will also provide better information to help doctors talk with their patients about whether or not a certain treatment is right for them, considering their individual circumstances," says Hayward. It could also save health care dollars by decreasing overuse of medical treatments, which have become increasingly expensive. "It is quite possible that a substantial proportion of runaway health care costs is related to advocates using average benefits of treatments to promote use of these treatments to large numbers of people who receive little benefit from them."

Of course, to make risk-stratified analysis possible, it's important to have good models, or statistical tools, that take into account the underlying factors that can make someone more or less prone to certain diseases or health incidents. For instance, in order to tell whether a new drug increases the risk of heart attack, researchers would need to use a model that calculates baseline risk of heart attack considering multiple risk factors, such as age, smoking history, cholesterol, family history, blood pressure, diabetes, past heart-related problems, and other factors.
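A baseline-risk model of that kind can be as simple as a logistic equation that combines several factors into one predicted probability. The sketch below is hypothetical -- the coefficients are invented for illustration, not drawn from the paper or from a validated score such as Framingham:

    # A made-up multivariable baseline-risk model, for illustration only.
    import math

    def baseline_heart_attack_risk(age, smoker, total_cholesterol, systolic_bp, diabetic):
        """Combine several risk factors into one predicted probability of a heart attack."""
        logit = (-11.0
                 + 0.07 * age
                 + 0.65 * smoker               # 1 if a current smoker, else 0
                 + 0.004 * total_cholesterol   # mg/dL
                 + 0.015 * systolic_bp         # mm Hg
                 + 0.60 * diabetic)            # 1 if diabetic, else 0
        return 1.0 / (1.0 + math.exp(-logit))

    print(baseline_heart_attack_risk(45, 0, 180, 120, 0))   # roughly 0.005 -- low combined risk
    print(baseline_heart_attack_risk(68, 1, 260, 160, 1))   # roughly 0.18 -- several factors compound

A model like this is what lets a trial analysis sort patients by their overall predicted risk rather than by one factor at a time.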

"There are relatively few cases where one risk factor increases a person's risk of harm or benefit from a treatment more than twofold, but there are many cases where multiple risk factors interact to increase it tenfold," says Hayward. Risk-stratified analysis may give much more trustworthy and useful answers than conventional analysis, Hayward says, but it is rarely done, especially if the researchers find the treatment is beneficial on average.

For the new paper, he and his colleagues reviewed trials published in three top medical journals in the year 2001. Of 108 trials that reported results of major patient outcomes from treatment, such as risk of death, 42 didn't report how the results broke down among subgroups.

Only four studies reported the benefit of the treatment for lower-risk and higher-risk patients at all -- and only one of them used an acceptable risk-stratified analysis.

Since researchers and the companies that fund studies of new treatments aren't using risk-stratified analysis, Hayward says, it may fall to medical journal editors and regulators to require it. "When a treatment has 'average' benefit, advocates for a treatment do not have an incentive to do a risk-stratified analysis, but there are real safety issues at stake if it isn't done," he explains.

He notes at least one instance in which the U.S. Food and Drug Administration required a manufacturer to perform an after-the-fact risk-stratified analysis: the case of drotrecogin, a drug for the deadly bloodstream infection sepsis. When the drug (which cost more than $10,000 per patient at the time) was originally studied, the researchers collected information about the patients that could be entered into a computer model to calculate their chance of dying and report it as a score. But this score was not reported in the original publication on drotrecogin's efficacy.

When the FDA required the scores to be used in a new analysis, it showed that the half of the patients with low death-risk scores received no benefit from the drug. So the FDA approved it only for use in patients with a high score.

More sophisticated statistical analysis may at first seem daunting to researchers and journal editorial boards not accustomed to such statistical methods, but Hayward notes that doctors in hospital intensive care units use it all the time to calculate patients' prognoses. And health Web sites, including some from the National Institutes of Health, include self-administered questionnaires that allow people to estimate their risk of heart attacks and breast cancer. Modern information technology tools have made it much easier to predict individual risk using a computer or personal digital assistant.

###

In addition to Hayward, Kent and Hofer, the new study's authors included Sandeep Vijan, M.D., M.Sc., an investigator at the VA Ann Arbor Healthcare System and an assistant professor of Medicine at U-M. The research was funded by the Department of Veterans Affairs through the VA Health Services Research and Development Service and the Cooperative Studies Program, and by the National Institutes of Health.


Story Source:

Materials provided by University of Michigan Health System. Note: Content may be edited for style and length.


