Issue 33 / 27 August 2012

WHETHER it’s the “cancer breakthrough” story or the alarmist exposé of a treatment putting “thousands at risk”, it isn’t hard to come up with examples of over-hyped medical stories in the general media.

Since 2004, the Media Doctor website run by the Newcastle Institute of Public Health has been calling journalists to account when they fail to adequately report the evidence or succumb to disease-mongering and other sins.

The gallery of good and bad stories on the site makes for interesting, if somewhat depressing, reading.

“In general, the coverage of new medical treatments in the lay press is regarded as poor and is prone to exaggeration of facts in order to create unnecessary sensationalism”, the site says.

Australia has some very good specialist health reporters but, in under-resourced newsrooms, medical stories can also be handed to a junior with no training or experience in assessing clinical research.

Add to that the parlous financial state of the mainstream media, the time pressures of the 24-hour news cycle, and the constant PR onslaught by commercial interests with a miracle cure to sell, and it’s easy to see how a medical story can fail to get the critical attention it deserves.

So what to do about it? Certainly, reporters should be called to account when they fall for the company line or get the facts wrong, but they’re not the only players in this game.

Ultimately, journalists rely on the expertise of others to help them communicate complex topics to the general public. And when it comes to reporting health stories, that generally means researchers and the medical journals in which they publish their findings.

While we may be well aware of the problem of companies with financial interests promoting new treatments through the general media, it’s easy to overlook the fact that researchers and specialist publications may not be entirely disinterested parties.

Their interests may be less tangible than a straight-out financial benefit — maybe career progression for a researcher or reputation for a journal — but they can still influence how the research is promoted to the general media.

The Media Doctor team, for example, analysed stories about cancer in the Australian media and found most of the “hyperbole and emotive statements” were actually made by researchers rather than journalists.

“This is fantastically significant for the 2800 Australian men who die of the disease every year”, one researcher said.

“If I had a supply now, I’d be giving it out straight away”, was another’s contribution.

Medical journals, too, can be guilty of overselling research findings when they put out press releases designed to get coverage in the general media.

It’s all too easy for an attention-hungry journal to gloss over a study’s limitations or describe findings in terms of relative risk, without reference to the more sober numbers represented by absolute risk.

A study earlier this year found a link between the quality of a journal’s press release and that of news stories about the research.

When the press release gave absolute risk figures, so did 53% of the news stories. If that information was not in the press release, it appeared in only 9% of news stories.

Similarly, 68% of stories mentioned harms of a treatment when these were in the press release, compared with only 24% when they weren’t.

What was intriguing was the researchers’ suggestion that poor-quality press releases might actually be worse than none at all.

On the measures above, news stories performed better when there was no press release than when the release had omitted the information: absolute risk was given in 20% of stories without a press release; harms were mentioned in 36%.

In other words, journalists seem to perform better when they are left to grapple with the research on their own than when the journal does a poor job of interpreting it for them.

MJA senior deputy editor and MJA InSight medical editor Dr Ruth Armstrong says the MJA has built in a number of checks and balances to ensure press releases do not overstate the case for research findings, often at the cost of making the releases “more boring”.

We journalists tend to push researchers to make bigger claims; to dramatise in the interest of the story.

Their job, and that of the medical journals, is not to join us in that endeavour, but to stick to what the research actually shows — to be steadfast, when necessary, in the embrace of “boring”.

Jane McCredie is a Sydney-based science and medicine writer.


One thought on “Jane McCredie: Boring beats beat-up”

  1. Patricia says:

    Perhaps the Media Doctor website sometimes does a good job of calling health journalists and researchers to account, but I believe that it itself needs to be more closely examined.

The “Newcastle Institute of Public Health”, which is behind Media Doctor, is not an Institute of any official government or university standing, so I believe the name is misleading. Media Doctor lists the details of the 16 academics and clinicians who are behind the website, which is good, but there is no information about the “Institute”. The “Institute’s” website hasn’t been updated since 2005, and there is no information on the site about what it is, who is in it, or how it is funded. In fact, the “Institute” website link goes to a site titled “excellence in functional foods”.

    Amanda Wilson, a main mover behind Media Doctor, quotes the NPS on the Media Doctor site and makes this statement about statins: “The available evidence shows that risks are relatively low and the benefits of the drug [sic] are good”.

In addition, the site has given three and a half stars (out of 5) to a recent news article which suggests that “Low-risk people benefit from heart drugs”.

    The jury is still very much out on the advisability or otherwise of mass statin prescriptions.

    I question the value of the Media Doctor website.
