Why Most Research Findings Are False
Previously I wrote about the wide disagreement in the scientific community over the effects of low-level radiation. Each side of this divide expresses contempt for the other's bias, while the general public has no way to make sense of the experts' contradictions or to determine where the truth lies.
Some good insights into this dilemma can be found in the work of Dr. John Ioannidis, who in 2005 published a paper called Why Most Published Research Findings Are False. Since then, the paper has been covered in popular media such as The Atlantic magazine and CTV News in Canada.
So many studies contradict each other, or later have to be retracted, because of several flaws that Dr. Ioannidis has categorized as follows:
A. Confirmation bias
This is the tendency to cherry-pick data, design the parameters of experiments in ways that will yield the expected results, and then make interpretations and value judgments according to these biases. The CTV article cited a satirical paper in the Canadian Medical Association Journal to illustrate the point: the writer concluded that smoking is good for health because it helps marathoners increase lung capacity, boost hemoglobin levels, and lose weight (all true, according to real research that was not part of the joke). The satire underscores that interpretation and value judgments matter more than the raw data.
B. Not accounting for confounders
If tobacco addiction didn't exist, many industries would feel it would have to be invented, because it has been the perfect confounder in many lawsuits against industrial polluters. When it comes to research on the health effects of radiation, all findings are confounded by the coexistence of chemical pollution.
C. Conflicts of interest
An obvious point: confirmation bias is created by the money and interests that finance research. In the nuclear industry there are a large number of highly skilled, high-paying jobs and huge financial investments at stake. One would have to be very naïve to believe that these interests haven't contributed to the production of research findings which conclude, for example, that the Chernobyl catastrophe had a very minimal impact on human health. Nuclear opponents are also deeply invested in their own positions and the imperatives of the groups they belong to, but compared to the nuclear industry they have much less at stake. Ultimate victory would not make them wealthy or create lucrative jobs for them. In fact, it would free them to do something else with their time. The same cannot be said of supporters of the nuclear industry.
D. Publication bias
This is the tendency to publish only findings that don’t diverge too much from what has been published before by a publication or an institution. Peer review has its obvious advantages, but it also tends to shut out innovative thinkers with radical new ideas. Some researchers are trying to get away from this blockage by crowdsourcing their research findings. They “advocate using the Internet to expose scholarly thinking to the swift collective judgment of a much broader interested audience.”
E. Institutions tend to overhype their studies’ conclusions
Financial pressures and fundraising drives create the temptation to overstate research results as they become part of public relations and advertising campaigns.
The general public, reviewers, and science journalists often ignore the tentative nature of the conclusions and play up findings when doing so suits their interests or makes for sensational news.
Ioannidis claims that research findings are also likely to be flawed:
- when effect sizes are smaller.
- when there is a greater number and lesser pre-selection of tested relationships.
- where there is greater flexibility in designs, definitions, outcomes, and analytical modes.
- when more teams are involved in a scientific field in a chase of statistical significance.
In the summary of his paper, Dr. Ioannidis states, "Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true. Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias." This is the state of affairs in rigorously controlled and peer-reviewed medical research. All of the factors above are sure to be even more pronounced in the social sciences, where the scientific method is less strictly applied and studies have more confounding factors that cannot be controlled in a laboratory.
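For readers curious where the headline claim comes from: Ioannidis's paper derives the positive predictive value (PPV) of a claimed finding — the probability that it is actually true — from the pre-study odds R that the tested relationship is real, the significance threshold α, and the type II error rate β. A minimal sketch of that formula (the function name and the example numbers are mine, not the paper's):

```python
def ppv(R, alpha=0.05, beta=0.2):
    """Positive predictive value of a claimed finding, without bias.

    R     -- pre-study odds that the tested relationship is true
    alpha -- type I error rate (significance threshold)
    beta  -- type II error rate (1 minus statistical power)
    """
    # True positives occur at rate (1 - beta) * R; false positives at rate alpha.
    return (1 - beta) * R / (R + alpha - beta * R)

# Well-powered confirmatory setting, 1:1 prior odds:
# ppv(1.0) ~ 0.94, so most claims are true.
# Exploratory research testing long-shot hypotheses:
# ppv(0.001) ~ 0.016, so most claimed findings are false.
```

This makes the listed corollaries concrete: shrink the prior odds R (many tested relationships, little pre-selection, small effects) and the PPV drops below one half, i.e. a published "finding" is more likely false than true.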
When faced with the wide, irreconcilable disagreements in the scientific community, what are the victims of the Fukushima Daiichi catastrophe to do? If the ICRP says only a few thousand people died from the Chernobyl catastrophe, and the ECRR says it was a million, shall we just split the difference and say it was 490,000?
When I consider the biases of the nuclear industry and anti-nuclear advocates, I tend to think that we could actually ignore all the peer-reviewed studies and just listen to the voices of people who lived through the catastrophe. We don't need forensic evidence to prove to us that the Nazi holocaust really happened; the corroborated stories of the victims, perpetrators, and liberating armies tell the story. In the same way, the accounts of medical personnel and patients who have lived in the aftermath of the catastrophe should be the evidence that nails the case shut. I leave the last word to a doctor from Belarus:
“Doctor Smolnikova checks baby Christina's heart through her stethoscope, and advises Valia on the chances of an operation. She has a long list of other patients like them.
‘Those who say there is no link with Chernobyl should open their eyes and look at the medical statistics,’ Doctor Smolnikova says.
She has been the village doctor here since long before the nuclear disaster.
‘Before Chernobyl I'd never seen a child with cancer. Now it's common. I treat many more children now with heart defects and kidney damage. To say it's nothing to do with Chernobyl just isn't honest.’”
by Sarah Rainsford
BBC News, Gomel, Belarus
April 26, 2005