Researchers generally hold the scientific method in high regard, but simple mistakes or the desire for prestige can still compromise the accuracy of their results. According to a study conducted by the Open Science Collaboration, many psychology experiments cannot be replicated with the same results.
Conducted by a team of 270 researchers, the project re-ran 100 previously published studies, suggesting that an objective point of view and fresh eyes can reveal significant flaws in current procedures. The team is trying to understand the ways in which researchers might be getting in the way of their own work.
Published in the journal Science, the results might prove discouraging: fewer than half of the original findings could be reproduced by the team. Only 36 percent of the replications produced statistically significant results, compared with 97 percent of the original studies.
But more important than the results themselves is the framework that made it possible for scientists to replicate the studies, which might point to a revolution in the way scientific research is conducted and reviewed.
The Open Science Collaboration is what made it possible for 270 co-authors and 86 volunteers to replicate the original studies, all of which were published in three esteemed psychology journals back in 2008.
This impressive level of coordination is the result of the Open Science Framework, an open-source software platform that allows researchers to compile materials, observe patterns and analyze data. Psychologist Brian Nosek, head of the study and CEO of the Center for Open Science, explained that the framework is an important tool for increasing transparency and reproducibility.
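The Open Science Framework also exposes a public REST API at api.osf.io, which is how shared materials and data can be retrieved programmatically. The sketch below is only an illustration, not code from the study: it assumes a hypothetical public project identifier (the placeholder "abcde") and the documented v2 endpoints for project metadata and OSF storage listings.

```python
# Minimal sketch: reading a public OSF project's metadata and file names
# via the OSF REST API (v2). "abcde" is a hypothetical placeholder id.
import requests

OSF_API = "https://api.osf.io/v2"
PROJECT_ID = "abcde"  # placeholder: substitute any public OSF project id


def fetch_project_summary(project_id: str) -> dict:
    """Return the title and description of a public OSF project."""
    resp = requests.get(f"{OSF_API}/nodes/{project_id}/", timeout=30)
    resp.raise_for_status()
    attrs = resp.json()["data"]["attributes"]
    return {"title": attrs.get("title"), "description": attrs.get("description")}


def list_osfstorage_files(project_id: str) -> list:
    """List file names stored under the project's default OSF storage provider."""
    resp = requests.get(f"{OSF_API}/nodes/{project_id}/files/osfstorage/", timeout=30)
    resp.raise_for_status()
    return [item["attributes"]["name"] for item in resp.json()["data"]]


if __name__ == "__main__":
    print(fetch_project_summary(PROJECT_ID))
    print(list_osfstorage_files(PROJECT_ID))
```

In practice this kind of open access is what lets an outside team inspect another lab's materials before attempting a replication.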
Nosek’s team of researchers shares a vision of scientific utopia in which scientists are evaluated on the quality of their research, regardless of its results.
In the end, studies shouldn’t be conducted for fame, they say, because that pursuit creates a culture in which the scientific publishing system is more interested in novel, positive results. Instead of encouraging researchers to be methodologically and statistically rigorous in their experiments, it primes them to produce studies with more exciting outcomes.
The Open Science Collaboration also aims to raise awareness about replication, urging researchers to take more responsibility for monitoring the reproducibility of their work. Many of the original 100 studies covered by this analysis began compiling data as early as 2006, which means the technology used to store materials was of the same age.