There is an excellent chapter (Chapter 9) in an excellent book (How Not To Be Wrong: The Power of Mathematical Thinking by Jordan Ellenberg) that gives a very good description of the inherent biases in research, drawing on John Ioannidis' 2005 paper ("Why Most Published Research Findings Are False").
The chapter starts with a parable, which is clever and funny…
Imagine yourself a haruspex; that is, your profession is to make predictions about future events by sacrificing sheep and then examining the features of their entrails, especially their livers. You do not, of course, consider your predictions to be reliable merely because you follow the practices commanded by the Etruscan deities. That would be ridiculous. You require evidence. And so you and your colleagues submit all your work to the peer-reviewed International Journal of Haruspicy, which demands without exception that all published results clear the bar of statistical significance.
Haruspicy, especially rigorous evidence-based haruspicy, is not an easy gig. For one thing, you spend a lot of your time spattered with blood and bile. For another, a lot of your experiments don’t work. You try to use sheep guts to predict the price of Apple stock, and you fail; you try to model Democratic vote share among Hispanics, and you fail; you try to estimate global oil supply, and you fail again. The gods are very picky and it’s not always clear precisely which arrangement of the internal organs and which precise incantations will reliably unlock the future. Sometimes different haruspices run the same experiment and it works for one but not the other—who knows why? It’s frustrating. Some days you feel like chucking it all and going to law school.
But it’s all worth it for those moments of discovery, where everything works, and you find that the texture and protrusions of the liver really do predict the severity of the following year’s flu season, and, with a silent thank-you to the gods, you publish.
— Jordan Ellenberg
It goes on to wend its way through the mechanisms (the 'file drawer' problem, whereby only positive findings are published; the problems of p-hacking, and of statistical inference generally) by which even a genuine researcher can find himself taking the first steps down the primrose way to the everlasting bonfire.
I propose that we refer to the High Priests of the Cult of Thermaggeddon as Haruspex Maximus – who somehow claim that their forecasts for the increase in the mean temperature 100 years out have a confidence interval that is less than a tenth of a degree wide… when the forecast for the mean temperature 10 years out has a wider forecast range than that. Somehow chained uncertainty reduces the forecast envelope – I think it helps if your work is only reviewed by others who think exactly the same as you (I doubt that a similar 'International Journal of Geocentrism Studies' would ever have found fault in any paper that fit the data using novel epicycles).
From a comment at 2016/03/21 at 7:08 am
A haruspex warmist, a seer,
Sent some work for review by a peer,
Who replied, "You've looked deep
At the guts of a sheep,
And your climate-change forecasts are clear".