Thoughts on Meta-Analyses
"Meta-analysis is to analysis what metaphysics is to physics."—Richard D. Feinman
We know that the vast majority of medico-scientific research is bogus. Estimates vary, but even conservative ones range from a bare majority to 75% or more.
“In medicine, the drug companies Bayer and Amgen, frustrated by the slow progress of drug development, discovered that more than three-quarters of the basic science studies they were relying on didn’t stand up when repeated.”
If this is the case, and there's no reason to think it's not, the reasonable approach is to try to filter out what's bogus.
The absolute worst approach would be to combine these studies, bogus and valid alike, into a meta-analysis: a study that pools the findings of disparate, often conflicting, papers.
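To see why combining bogus and valid studies is so dangerous, here is a minimal sketch, with entirely made-up numbers, of inverse-variance fixed-effect pooling, the standard arithmetic behind a meta-analysis. A fabricated "large" trial reports a tiny variance, so it gets the largest weight and drags the pooled estimate wherever it likes:

```python
# Hypothetical illustration (invented data, not from any real review):
# fixed-effect meta-analysis weights each study by the inverse of its variance.
def pooled_estimate(effects, variances):
    """Inverse-variance weighted average of study effect sizes."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Three honest small trials find essentially no effect...
honest_effects = [0.02, -0.01, 0.03]
honest_vars = [0.04, 0.05, 0.04]

# ...and one fabricated trial reports a strong effect with a tiny variance,
# so it receives far more weight than the honest trials combined.
fraud_effect, fraud_var = -0.50, 0.005

clean = pooled_estimate(honest_effects, honest_vars)
tainted = pooled_estimate(honest_effects + [fraud_effect],
                          honest_vars + [fraud_var])

print(round(clean, 3))    # near zero
print(round(tainted, 3))  # dragged strongly negative by the one bad study
```

The pooling formula itself is standard; the point is that it has no notion of trustworthiness, only of reported precision, so a confident fraud dominates honest uncertainty. Garbage in, garbage out.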
Yet that is what we do.
Hooper 2020, a Cochrane review of the effects of dietary fat on cardiovascular disease, relied on a trial by a researcher suspected of fraud.
This was known to the authors at the time they wrote the review; they had noted the concern themselves previously (Hooper 2018).
This post by @ZahcM goes over it in depth.
“Saturated Fat, Cochrane, and the Houtsmuller Trial: A Case of Research Fraud?”
And it’s not like we’re the only ones noticing this.
“I study bias in the design, conduct and publication of research. I have been part of a years-long initiative to exclude fraudulent studies from Cochrane’s reviews…. Cochrane’s research-integrity team is only two people, working part time—not enough to investigate every possible fraudulent study.”
But that's not the worst.
"As he described in a webinar last week, Ian Roberts, professor of epidemiology at the London School of Hygiene & Tropical Medicine, began to have doubts about the honest reporting of trials after a colleague asked if he knew that his systematic review showing that mannitol halved death from head injury was based on trials that had never happened. He didn’t, but he set about investigating the trials and confirmed that they hadn’t ever happened. They all had a lead author who purported to come from an institution that didn’t exist and who killed himself a few years later. The trials were all published in prestigious neurosurgery journals and had multiple co-authors. None of the co-authors had contributed patients to the trials, and some didn’t know that they were co-authors until after the trials were published. When Roberts contacted one of the journals the editor responded that “I wouldn’t trust the data.” Why, Roberts wondered, did he publish the trial? None of the trials have been retracted."
“Time to Assume that Health Research is Fraudulent until Proven Otherwise?”
The answer to the title of that article is "Yes", certainly as regards meta-analyses.
So if your response to a study is to cite a meta-analysis, that’s not very convincing. Almost uniformly, the people conducting meta-analyses are not experts in the field, and the inclusion and exclusion criteria, or the very premise of the review, as in Hooper 2020, may be flawed.
In my merger-arbitrage days, we had a joke concerning many mergers:
"They're tying two rocks together and hoping they float."
“Later Roberts, who headed one of the Cochrane groups, did a systematic review of colloids versus crystalloids only to discover again that many of the trials that were included in the review could not be trusted. He is now sceptical about all systematic reviews, particularly those that are mostly reviews of multiple small trials. He compared the original idea of systematic reviews as searching for diamonds, knowledge that was available if brought together in systematic reviews; now he thinks of systematic reviewing as searching through rubbish. He proposed that small, single centre trials should be discarded, not combined in systematic reviews.”
Tying bogus studies together and hoping they float: it's no less true of meta-analyses.
GIGO: garbage in, garbage out. John Ioannidis is another who has pointed out that much published research is untrue and/or unreproducible.
I seem to remember that Malcolm Kendrick is very critical of meta-analysis. Yet I see frequent references online to meta-analyses as a "gold standard", second only to RCTs. It seems to be a useful way to add another obfuscating layer of statistical manipulation. Not to mention relatively inexpensive: no test tubes or live subjects needed, just a computer.
Hans Jürgen Eysenck, 25 years ago: ‘If a medical treatment has an effect so recondite and obscure as to require meta-analysis to establish it, I would not be happy to have it used on me’. https://academic.oup.com/eurheartj/article/40/40/3290/5594072