– or why there’s no point doing a systematic review if you don’t report it properly.
So this is going to be a little bit of rant, plus a bit of pleading, and a call to arms.
- What’s the point of doing any form of research if you report it so inadequately that people are almost morally obliged to ignore it?
- Or if you are so selective in your reporting that it gives a completely wrong impression of the results?
- The question of not reporting it at all (non-publication of PhDs, for example) is for another day.
There have been plenty of examples where poor reporting of randomised controlled trials has resulted in a false understanding of the efficacy of a treatment. Think Tamiflu or Reboxetine. Both of these, and plenty more like them, are the reason behind AllTrials (and all power to it).
But what about the quality of reporting of systematic reviews? “Poor reporting of systematic reviews diminishes their value to clinicians, policy makers, and other users.” Their methodology needs to be reported as clearly and comprehensively as that of clinical trials. Let me tell you a story…
A group of librarians from Spain, France and Switzerland met at an EAHIL conference, and decided to do a little research around Evaluating the information retrieval quality and methodological accuracy of Systematic Reviews and Meta-analysis on congenital malformations (2004-2014)
This work was presented by Alicia Fátima Gómez @
The topic of the papers was, I think, just a way of keeping the workload manageable (analysing 162 papers is enough of a workload on top of your day job), and so too was the range of years covered. The point of it was to see whether any of the SRs or MAs actually conformed to the reporting guidelines that are available, and whether any contribution by a librarian to the whole process was visible.
It produced some very interesting (depressing?) results. Here are just a few of them…
- 80% of papers did NOT mention PICO
- 68% of the searches were NOT fully described & transparent (absolutely essential for any reproduction of the work in the future)
- 66% did NOT use controlled vocabulary (MeSH, etc)
- 48% did NOT explicitly use synonyms (put this together with the previous point, and I don’t actually think it equates to a literature search, far less a systematic review!)
- 49% did NOT recognise any risk of bias in their work (e.g. language limitations)
- most studies only used one database – a significant source of bias, and a virtual guarantee that relevant papers will be missed.
- less than 10% of papers mentioned a contribution by a librarian
This is a relatively small study but alarm bells are clanging like crazy.
The last point is particularly galling when you consider the work by Rethlefsen et al., “Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews”:
“Problems remain with SR search quality and reporting. SRs with librarian or information specialist co-authors are correlated with significantly higher quality reported search strategies. To minimize bias in SRs, authors and editors could encourage librarian engagement in SRs including authorship as a potential way to help improve documentation of the search strategy.”
We can help researchers publish better work!!
There are some tremendous reporting guidelines for all sorts of research — pulled together in one place by the EQUATOR Network (and all power to them!) — and plenty of specific advice around the reporting of systematic reviews:
and yet there appears to be very little use made of these guidelines by editorial boards and peer reviewers when assessing the work that is submitted for publication.
So, all of this was why on Friday, after a hard few days of conferencing at EAHIL 2016, and with the aid of @tomroper, I asked that EAHIL act. The full text is available here. The motion was carried, and I’m looking forward to seeing what we can do to improve the situation.
PS – also delighted to see:
- Epidemiology and Reporting Characteristics of Systematic Reviews of Biomedical Research: A Cross-Sectional Study: “An increasing number of SRs are being published, and many are poorly conducted and reported. Strategies are needed to help reduce this avoidable waste in research.”
- Impact of interventions to improve the quality of peer review of biomedical journals: a systematic review and meta-analysis “Despite the essential role of peer review, only a few interventions have been assessed in randomized controlled trials. Evidence-based peer review needs to be developed in biomedical journals.“