Healthy skepticism needed for some scientific claims.
Independent researchers couldn’t reproduce the findings of more than half of 100 experiments previously published in three prominent psychology journals, a new review reports.
This review should fuel skepticism over scientific claims, particularly if those claims are based on shaky statistics, said one of the new study’s authors, Brian Nosek, a professor of psychology at the University of Virginia. Nosek is also executive director of the Center for Open Science, the non-profit group that coordinated the project.
Only 47 percent of the follow-up studies were able to reproduce the effects of the original studies, the review found. And even when an effect was successfully replicated, it tended to be weaker than the one originally reported, Nosek said.
The new review also calls into question the statistics used in the original studies. About 97 percent of the original studies showed a statistically significant result, but only 36 percent of the replication studies did the same.
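One way such a gap can arise is through selection for significance: when true effects are modest and studies are small, the original results that clear the p < .05 bar tend to overstate the effect, and independent replications often fail to reach significance. The short Python simulation below is a minimal sketch of that dynamic, not the project's actual method; the sample size, true effect size, and publication filter are all assumed values chosen for illustration.

```python
# A minimal sketch (hypothetical parameters, not the project's method):
# small studies of a modest true effect, where only statistically
# significant "originals" get published, then each is replicated once.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
N_PER_GROUP = 30   # assumed sample size per group
TRUE_D = 0.3       # assumed true effect size (in SD units)
TRIALS = 10_000

def run_study():
    """Simulate one two-group experiment; return (effect estimate, p-value)."""
    control = rng.normal(0.0, 1.0, N_PER_GROUP)
    treated = rng.normal(TRUE_D, 1.0, N_PER_GROUP)
    _, p = stats.ttest_ind(treated, control)
    # With unit variances, the mean difference approximates Cohen's d.
    return treated.mean() - control.mean(), p

orig_d, rep_d, rep_p = [], [], []
for _ in range(TRIALS):
    d0, p0 = run_study()
    if p0 < 0.05:              # publication filter: only significant originals
        d1, p1 = run_study()   # one independent replication attempt
        orig_d.append(d0)
        rep_d.append(d1)
        rep_p.append(p1)

print(f"replications significant at p < .05: {np.mean(np.array(rep_p) < 0.05):.0%}")
print(f"mean published original effect:      {np.mean(orig_d):.2f}")
print(f"mean replication effect:             {np.mean(rep_d):.2f}")
```

Under these assumed parameters, only a minority of the replications come out significant, and the replication effects average roughly half the size of the selected originals, a pattern broadly consistent with the numbers the review reported.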
“Reproducibility is a central feature of science,” Nosek said. “A scientific claim doesn’t become believable because of the status or authority of the person that generated it. Credibility of the claim depends in part on the repeatability of its supporting evidence.”
The study was published in the Aug. 28 issue of Science.
People reading about new studies should approach them with a skeptical eye, understanding that each new finding is just a small addition to a vast scientific panorama that is constantly growing and shifting, said Stephen Lindsay, a professor of psychology at the University of Victoria in British Columbia and editor of one of the journals reviewed, Psychological Science.
“I think the really important thing the public needs to understand is that they should be very skeptical about results they hear from a single study, unless it’s a really huge study that’s done in a very impressive way,” Lindsay said.
Nosek agreed. “That’s the reality of science — we’re going to get lots of different competing pieces of information as we study difficult problems,” he said. “We’re studying them because we don’t understand them, and so we need to put in a lot of energy in order to figure out what’s going on, and it’s murky for a long time before answers emerge.”
Based on these findings, academic journals need to take a harder-nosed approach to articles they’re considering for publication, said Alan Kraut, executive director of the Association for Psychological Science, which publishes one of the journals involved in this study.
“We’ve changed how articles are published in our flagship journal, Psychological Science, changes that encourage greater transparency, stronger statistical analyses, and provide special recognition for preregistering hypotheses and for sharing materials and data,” Kraut said.
The new review included studies published in 2008 in the journals Psychological Science, Journal of Personality and Social Psychology, and Journal of Experimental Psychology: Learning, Memory, and Cognition.
More than 270 researchers from all over the world agreed to take on someone else’s earlier experiment and redo it, to see whether they could produce the same results.
This initial effort focused on psychology because members of the team that started the project are psychologists, Nosek said. But Nosek’s center has since started working on a similar project in cancer biology, and it hopes to expand to other fields in the future.
“There are reasons to expect that there might be similar issues across disciplines, since the incentives driving individual scientists’ behavior are very similar across disciplines. It’s a very competitive marketplace in all research disciplines,” Nosek said.
Lindsay added that “the pressure to publish keeps ramping up and up and up, and the criteria for acceptance in these major journals have been more and more surprising results, more and more interesting results.”
Marcia McNutt, editor-in-chief of Science, said that “studies like this are going to help lead to a better understanding of the level of quality control and documentation that facilitates reproducible research.”
More than 500 scientific journals have signed onto a set of standards presented by Science earlier this year, which call for promotion of transparency and openness in the process by which studies get published, McNutt said.
But McNutt added that the results of this new review should not prompt broad skepticism toward science in general, and they don’t necessarily invalidate all of the findings published in the original papers.
“It’s so important for everyone to remember that just because a result is reproducible does not necessarily make it right. There are many examples of results that were completely reproducible and yet were fundamentally wrong,” she said.
“And the failure of a result to be reproduced does not necessarily make it wrong,” she added.
SOURCES: Brian Nosek, Ph.D., professor of psychology, University of Virginia and executive director, Center for Open Science, Charlottesville, Va.; Stephen Lindsay, Ph.D., professor of psychology, University of Victoria, British Columbia, and editor, Psychological Science; Alan Kraut, Ph.D., executive director, Association for Psychological Science; Marcia McNutt, Ph.D., editor-in-chief, Science; Aug. 28, 2015, Science
Written for HealthDay News and published at MedlinePlus, August 27, 2015.