Of 100 Published Psychology Studies, Less Than Half Could Be Reproduced Successfully

Prepare your grains of salt

One of the pillars of scientific research, perhaps the one that makes science as definitive as it is, is that any study should be repeatable: if the research holds true, performing the experiment again under the same methods and conditions should yield the same result every time. This property is known as a study's reproducibility.

A group of researchers found that when they actually tried to reproduce 100 psychology studies, they managed to replicate the results in less than half the cases. Their results were published today in Science, and online as a resource for other scientists at The Reproducibility Project.

These new results piggyback on previous efforts to replicate study results already published in scientific journals. This report also follows in the wake of a highly publicized psychology study about attitudes toward same-sex marriage, originally published in Science, which was found to be inaccurate and was retracted from the journal in May.

To better understand what leads to reproducibility, a group of 270 psychologists selected 100 psychology studies from three leading journals in the field (Psychological Science, the Journal of Personality and Social Psychology, and the Journal of Experimental Psychology: Learning, Memory, and Cognition) and replicated the studies under the same methods and conditions. In most cases the methods and conditions could be reproduced; the results, however, were replicated in fewer than half the cases, the researchers report. When they analyzed the results, they found that 97 percent of the original studies had statistically significant results, but when the studies were replicated, only 36 percent of the new results were statistically significant.

The researchers note that failing to reproduce the same results does not necessarily discredit the original study (and discrediting studies was not the point of their project), but it does help to determine which factors predict the likelihood that a study can be reproduced with the same results.

They found that the best predictor of whether a study's results would replicate successfully was its p-value. In statistics, the p-value helps researchers judge how much weight to give their results. In technical terms, it is the probability of obtaining results at least as extreme as those observed if there were actually no real effect, in other words, if the outcome were due simply to random chance. The higher the p-value, the more consistent the results are with pure chance, and the weaker the evidence for the study's hypothesis. A p-value cannot prove a hypothesis; it can only quantify how surprising the data would be if chance alone were at work.

The researchers found that studies with p-values closer to 0.05 (the conventional maximum cutoff for claiming that results are statistically significant) were less likely to be replicated successfully, whereas studies with p-values closer to zero were more likely to be replicated with success.
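That pattern can be seen in a toy simulation. The sketch below is not the project's own analysis: the sample size, effect size, and the assumption that half of the simulated "literature" contains no real effect are all illustrative choices. It simulates many original studies, sorts the significant ones into barely significant (p between 0.01 and 0.05) and strongly significant (p below 0.001), then attempts one replication of each and compares how often the replication also reaches p below 0.05.

```python
import math
import random

def p_value(sample_mean, n):
    """Two-sided p-value for a one-sample z-test of 'true mean = 0',
    assuming known unit variance (a textbook simplification)."""
    z = abs(sample_mean) * math.sqrt(n)
    # Standard normal survival function via the error function.
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

def simulate_study(true_effect, n, rng):
    """Draw one study's sample mean and return its p-value."""
    sample_mean = rng.gauss(true_effect, 1 / math.sqrt(n))
    return p_value(sample_mean, n)

rng = random.Random(42)
n = 30  # participants per simulated study (illustrative)
counts = {"borderline": [0, 0], "strong": [0, 0]}  # [originals, successful replications]

for _ in range(20000):
    # Assumption: half the simulated studies chase no real effect,
    # half a genuine effect of 0.6 standard deviations.
    effect = 0.0 if rng.random() < 0.5 else 0.6
    p = simulate_study(effect, n, rng)
    if 0.01 < p < 0.05:
        group = "borderline"   # barely significant original
    elif p < 0.001:
        group = "strong"       # strongly significant original
    else:
        continue
    counts[group][0] += 1
    # Replication attempt with the same (unknown to us) true effect.
    if simulate_study(effect, n, rng) < 0.05:
        counts[group][1] += 1

for group, (total, hits) in counts.items():
    print(f"{group}: {hits / total:.0%} of originals replicated")
```

Under these assumptions, the strongly significant originals replicate noticeably more often than the borderline ones, because a barely significant result is much more likely to have come from a study of a nonexistent effect that got lucky.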

This is the first open, systematic study of reproducibility in psychology, the researchers write, and they caution that its findings cannot be extrapolated to other disciplines. The main point of their study was to identify the factors that may help predict whether a study's results can be reproduced successfully. They published their results in an open online database in the hope that other scientists will join the effort to ensure that studies are reproducible, and that the conclusions drawn from them are accurate and reliable.

Claire Maldarelli

is the Science Editor at Popular Science. She has a particular interest in brain science, the microbiome, and human physiology. In addition to Popular Science, her work has appeared in The New York Times, Scientific American, and Scholastic's Science World and Super Science magazines, among others. She has a bachelor's degree in neurobiology from the University of California, Davis and a master's in science journalism from New York University's Science, Health, and Environmental Reporting Program.