
Professor Guy Goodwin, President of the European College of Neuropsychopharmacology (ECNP), reflects on the not-so-surprising findings on the 'reproducibility of psychology experiments'.

'At the end of August, Science published a major study on the reproducibility of psychology experiments. As the authors point out, replication is the defining feature of science. Speaking personally, it is what attracted me to it in the first place. So much else in the politics of everyday life is Neanderthal by comparison. The struggles of hunter-gatherers to gain power over each other and feed their families implicate us all. The scientific method offers relief from politics, ideology, religion, and, more positively, the path to a kind of truth. So it comes as an undeniable shock to find out how much of what gets published as science is actually false.

The authors of this report conducted replications of 100 experimental and correlational studies published in three psychology journals. They went to great lengths to optimize their experimental practice in the replications. The measured effects were half the size of the original effects, and only 36% of the replications reached statistical significance. Combining the original and replication results, 68% reached statistical significance. So is the glass 68% full or 32% empty? And does it matter?
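One standard explanation for the pattern described above, halved effect sizes and a low replication rate, is publication bias on top of low statistical power. The toy simulation below is a sketch of that mechanism only, not the method of the Science study; the true effect size, sample size, and selection rule are all illustrative assumptions.

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.30   # assumed true standardized effect (hypothetical value)
N_PER_GROUP = 30     # assumed sample size per group (hypothetical value)
SE = (2 / N_PER_GROUP) ** 0.5   # standard error of the effect estimate
Z_CRIT = 1.96        # two-sided 5% significance threshold

def observed_effect():
    """One study's estimated effect: the true effect plus sampling noise."""
    return random.gauss(TRUE_EFFECT, SE)

def significant(d_hat):
    """Two-sided z-test at the 5% level."""
    return abs(d_hat) / SE > Z_CRIT

# Publication bias: only statistically significant originals get published.
published = [d for d in (observed_effect() for _ in range(100_000))
             if significant(d)]

# Each published finding is replicated once, with no selection this time.
replications = [observed_effect() for _ in published]

mean_published = statistics.mean(abs(d) for d in published)
mean_replicated = statistics.mean(abs(d) for d in replications)
replication_rate = statistics.mean(significant(d) for d in replications)

print(f"mean published effect:    {mean_published:.2f}")
print(f"mean replication effect:  {mean_replicated:.2f}")
print(f"replication success rate: {replication_rate:.0%}")
```

Under these assumed numbers, published effects are inflated well above the true value (only estimates that cleared the significance bar were kept), while unselected replications cluster around the true effect and reach significance far less than half the time, qualitatively the same glass-half-empty picture the report describes.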

This problem has not been as widely appreciated as it should have been, considering that John Ioannidis published a paper a decade ago (PLOS Med. 2, e124 (2005)) with the best title ever: ‘Why most published research findings are false’. The paper has been cited over 3,000 times, but perhaps fewer people have actually read and understood his point. Concern about valid statistical analysis and reproducibility has also influenced clinical trials practice for much longer than a decade. Currently, failure to replicate seems to be a problem that most concerns laboratory biomedical science, psychology and experimental medicine.

The publishing culture of science receives much of the blame for the present state of affairs, and it will be interesting to see if some sort of regulation can reduce the recognized bias without making publication slower and more painful than it already is. Maybe publishing has to become less about individual papers in a print volume and more about how we synthesize and map progress for a whole field using information technology (which may have to be invented for the purpose). Genomics may be the motor for such a change: masses of shared data, making sense of complex phenotypes, applying discoveries to innovative treatment.

The problems for industry are very similar. There is no point in patenting drugs on the basis of science that does not replicate. Furthermore, patent attacks on successful drugs (usually by pharmaceutical companies that want to market generic versions of drugs nearing the end of their patent life) can focus on any discrepancy between what the data promised and what the drug delivered. In this respect, industry-based scientists have probably been under more pressure to improve practice than their peers in academia. They can probably teach academic centres something important about how to reduce the more obvious risks.

ECNP provides the meeting place between academia and industry where solutions or developments can take place in this area. We should make good practice central to the training we offer junior scientists and clinicians, and there may be scope to develop a new network initiative that looks more deeply into possible solutions to the bigger challenge of data quality and data synthesis. At the end of the day, science still works, but we now understand how inefficient and misleading some of our current practices have become, and that is also how science works.'

This is kindly reproduced from an open message from Guy Goodwin to the ECNP.


Please follow the link below to read the news on the NIHR BRC website.