A modest proposal for the science media (2)
September 17, 2013
(Feel free to take it in the spirit of the great Jonathan Swift’s original.)
What prompted the proposal
Most mornings, I say hello to the Internet. In return, I find a slew of press releases, heralding recent, or even advance, publications in neuroscience, psychology, medicine and health. Most of it’s from just-published studies, dolled up by press officers and spoonfed to the media. And, as the UK’s Astronomer Royal Martin Rees has implied, a lot of it’s tosh.
I tell my students that it’s better to read first-rate science fiction than second-rate science. It’s more stimulating, and no more likely to be wrong.
In my previous post, I discussed two examples of high-profile science. One, a study published in the Journal of the National Cancer Institute, proposed a link between omega-3 fatty acids and prostate cancer. The second, in the prestigious journal PNAS, proposed a link between copper (Cu) and Alzheimer’s disease (AD). They triggered this particular piece of devil’s advocacy, but they’re only two examples of very many, and my beef is not with them, but with the system that produced them.
A modest proposal for reform
No media organisation with more than 10,000 regular readers should be allowed to publish news of any scientific research until one year has passed from the date of first publication, or the study has been successfully replicated.
The omega-3s/cancer and copper/Alzheimer’s studies were widely reported. What did this high profile achieve for the general public?
Many will have missed, ignored, or instantly forgotten the news. Some of the rest, however, may have worried about whether they should change their diet. They may have asked their doctors about it, or wasted time surfing dubious Internet sites. A few may even have used the study as an excuse for not eating more fruit and veg. Others may have thought, crossly, that they wished the bloody scientists would make their minds up, or remarked that you can't believe anything you read in the media these days.
I’d be willing to bet that very few will have rejoiced at the extent of their new empowerment, thanked the Press for bringing them the truth so quickly, and happily formulated a new, fish- and fruit-free diet in order to live long into healthy old age.
In other words, we may have a slight increase in anxiety, cynicism and distrust of science, but on the positive side we have … what, exactly? The thrill of novelty. Pages filled, buttons pressed, teeny-weeny neurotransmitter hits delivered. Readers fooled into thinking this particular organisation is hovering at the cutting edge and holding the boffins to account. And this benefits the public – how?
It benefits the media. It also benefits the authors and their institutions, given the current insistence of funding bodies on generating 'impact'. But science, wisely, distrusts new findings and insists on their being replicated before it takes them too seriously. Might not the science-media-reading public benefit from knowing that a similar standard has been applied to what they are being told? We're already overloaded with data, or at least information. The modest proposal would reduce quantity and boost quality. It should also give journalists more time to do investigative science journalism.
As it is, quite often reporters don’t understand what they’re reading, because they aren’t specialist science journalists, and the reader doesn’t bother glancing past the headline anyway. This isn’t a criticism of either: readers and journalists are busy people, and even experienced scientists can struggle to understand publications in other scientific disciplines.
But scientists are to some extent held to account by the profession’s self-correcting mechanisms, and its longer timescale – though publicity-chasing for ‘impact’ may distort these. In the media, sensational findings are common; reports of how they were discredited are rarer.
Six conditions for publishing science journalism
Obviously the modest proposal would need expansion. For starters, how about the following six conditions to be met before the news is published:
1. At least two independent experts to be consulted about the study’s merits, and their views to be reported on whether the study is mainstream, minority, maverick, nonsense, or dangerous nonsense
a. If both experts class it in the ‘dangerous nonsense’ category, the journalist may consult three more experts
b. If these disagree, the views of all five should be reported
c. If all five concur that it’s dangerous nonsense, the report should be scrapped
2. Similarly, the experts to be asked for their judgement on whether the headline is appropriate, and that judgement either to be taken into account, or at least reported
3. The journalist to have read at least the introduction and discussion sections of the article, not just the press release
4. Potential conflicts of interest to have been checked by the journalist – and not just by reading the authors' 'yup, we're clean' statement
5. For health studies, a risk assessment to have shown that the putative risk to human health (the 'culprit') identified in the study is higher than the risk to health which would follow from giving up the culprit – or, failing that, both numbers to be stated in the report
6. Information about the study should, at the very least, include baseline values as well as the size of the effect/risk/change, the type of study, a link to the article, and the number of participants tested.
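Condition 6's insistence on baseline values matters because a relative risk quoted on its own can make a tiny absolute change sound alarming. A minimal sketch of the arithmetic, using invented illustrative rates rather than figures from either study discussed above:

```python
# Illustrative only: the rates below are invented, not taken from the
# omega-3/prostate-cancer or copper/Alzheimer's papers.

def absolute_and_relative_risk(baseline_rate, exposed_rate):
    """Return (absolute difference, relative risk) for two incidence rates."""
    return exposed_rate - baseline_rate, exposed_rate / baseline_rate

# The same "43% increased risk" headline means very different things
# depending on the baseline the reader was never shown:
abs_diff, rel = absolute_and_relative_risk(0.007, 0.010)
print(f"relative risk {rel:.2f}, absolute increase {abs_diff:.3f}")
# i.e. roughly 3 extra cases per 1,000 people, despite a ~43% relative rise
```

Reporting both numbers, as condition 6 asks, lets the reader judge whether the change is worth worrying about at all.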
Or, how about a kitemark?
I try to be a realist, and realistically, the modest proposal is never actually going to happen. Instead, it would be great if someone could set up a kitemark, a signal of quality, for science journalism, administered by an independent body. I don't think it would be fair to apply it wholesale to organisations, but to individual pieces, and to major blog posts – why not? If organisations were required to display the percentage of their articles that have received a kitemark, that might encourage them to aim for a higher percentage.
Come to think of it, why don’t we do that for the whole of journalism, and make the industry fund the jobs required? Better quality and more employment, in one!