IN 2007, the US enacted a law requiring the results of clinical trials to be published within one year of completion.
It was a no-brainer. Clinicians and members of the public need access to trial results to stay up to date on likely benefits and harms of treatments. Researchers need to know what others have found to avoid needless duplication of research and identify the best targets for further inquiry.
Good to have that sorted then.
In the absence of enforcement, many trial sponsors ignored the new legal requirement set out in the Food and Drug Administration Amendments Act, posting trial results erratically or not at all.
It took a decade for US regulators to issue a long-awaited final rule clarifying expectations under the law and penalties for non-compliance, including potential fines of more than US$10 000 per day for sponsors who did not publish results on time.
That rule came into full effect on 18 January 2018, two years ago this month.
Did it fix the problem? Apparently not.
Results are supposed to be published on the ClinicalTrials.gov website, the largest trial register in the world.
Researchers from the University of Oxford’s DataLab analysed compliance for the more than 4000 trials in the register to which the final rule applied and found only 41% reported results within the one-year deadline.
When those who reported late were included, the percentage rose to 64%. In other words, more than a third of eligible registered trials failed to report results at all.
Overall, compliance is poor and not improving, the DataLab authors write in The Lancet.
“Clinical trials are not abstract research projects,” they write. “They are large, expensive, practical evaluations that aim to directly inform clinical practice.”
Missing trial data undermines the reliability of clinical guidelines and systematic reviews, they argue, making it impossible for patients and clinicians to make informed choices.
The DataLab team had earlier identified a similar failure to report results for trials on the European register, despite a legal requirement to do so.
Given the dominance of the US and Europe in clinical research, this is a disturbing situation.
Interestingly, the list of worst offenders in the latest study is mostly composed of not-for-profit and US government-sponsored trials.
Some major research organisations have exceptionally low reporting rates, including the Mayo Clinic (21%), the MD Anderson Cancer Center (34%) and the University of California San Francisco (16%).
In contrast, the pharmaceutical industry comes out fairly well under scrutiny, with some of the biggest operators achieving reporting rates of close to 100%.
That would appear to be a real turnaround from five years ago, when I wrote about much lower reporting rates by some major pharmaceutical companies.
It’s worth asking why big pharma has taken the reporting requirements seriously while non-commercial research operations have largely ignored them.
Perhaps the industry sees the consequences of non-compliance as a real financial and reputational threat, while those in the not-for-profit sector don’t.
University researchers may well be comforted by the thought that the FDA isn’t going to fine them.
And, so far, they’d be right. The FDA has not yet issued a single one of those $10 000-a-day fines, despite the widespread non-compliance.
In response to a question from the DataLab researchers, the agency said it preferred to encourage voluntary compliance.
Nobody would want to see non-commercial medical research crippled by huge fines, but it does seem the regulator requires some teeth if it is to effect change.
Researchers and the organisations they work for need to step up and take their ethical responsibility to report results seriously.
The benefits of sharing this kind of knowledge are clear.
The failure to share is a betrayal of trial participants who have given up their time, and perhaps risked their health, in the belief that this will contribute to the sum of knowledge about the condition being studied.
Jane McCredie is a health and science writer based in Sydney.
The statements or opinions expressed in this article reflect the views of the authors and do not represent the official policy of the AMA, the MJA or InSight+ unless so stated.