
An international team of researchers asked fellow social scientists to assess, through a stock-exchange-like market, which social science studies would replicate.

Decision Markets Predict Replicability of Online Experiments

A new study published in Nature Human Behaviour suggests that a tool similar to a stock market could help scientists identify research findings that are likely to hold up to scrutiny. The study, conducted by a group of social scientists from around the world, tested the ability of a “decision market” to predict which results from previously published social science experiments would be successfully replicated.

The decision market focused on 41 experiments that used participants recruited through Amazon Mechanical Turk (MTurk), an online platform that connects researchers with people willing to complete tasks for pay. All of the experiments were originally published in the Proceedings of the National Academy of Sciences (PNAS) between 2015 and 2018. Among the paper's authors is Felix Holzmeister from the Department of Economics at the University of Innsbruck.

Buying and selling

In the decision market, 162 social scientists bought and sold shares that represented the likelihood that a given study's findings would replicate. The market was set up so that the 12 studies whose shares ended at the highest prices and the 12 whose shares ended at the lowest prices would be selected for replication, along with two further studies chosen at random.
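To make the mechanism concrete, here is a minimal sketch of the selection rule described above, written in Python. The function name, the price data structure, and the fixed random seed are illustrative assumptions; the article only specifies the top-12/bottom-12/two-random rule.

```python
import random

def select_for_replication(final_prices, n_extreme=12, n_random=2, seed=0):
    """Pick studies for replication from final decision-market prices.

    final_prices: dict mapping a study ID to its final share price,
    read here as the market's estimate of the replication probability.
    (Names and structure are illustrative, not taken from the paper.)
    """
    ranked = sorted(final_prices, key=final_prices.get, reverse=True)
    top = ranked[:n_extreme]               # 12 highest-priced studies
    bottom = ranked[-n_extreme:]           # 12 lowest-priced studies
    middle = ranked[n_extreme:-n_extreme]  # the remaining 17 of 41
    extra = random.Random(seed).sample(middle, n_random)  # 2 drawn at random
    return top + bottom + extra
```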

The results suggest that the decision market successfully distinguished replicable from non-replicable findings. Of the 12 studies with the highest market prices, 83% produced statistically significant results in the same direction as the original study; of the 12 with the lowest prices, only 33% replicated. “Our study provides a proof of concept that decision markets can be a useful tool for identifying likely true and likely false research findings,” Felix Holzmeister explains.
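The replication criterion used in this summary, a statistically significant effect in the same direction as the original, can be written as a small predicate. The 0.05 threshold is a conventional assumption; the article does not state the significance level here.

```python
def replicated(p_value, original_effect, replication_effect, alpha=0.05):
    """Replication success as summarized above: the replication yields a
    statistically significant effect (assumed threshold alpha) whose sign
    matches the original effect's sign."""
    same_direction = original_effect * replication_effect > 0
    return p_value < alpha and same_direction
```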

Overall, replications were successful for 54% of the 26 studies selected for replication. The average effect size observed in the replication studies was 45% of the average original effect size. This overall replication rate is comparable to those found in previous large-scale efforts to replicate findings from social science experiments. The study’s findings have important implications for the way that science is conducted, says Holzmeister: “Decision markets, or similar tools that rely on scientists’ collective wisdom, could offer a principled mechanism to prioritize research efforts and resources, ultimately increasing the efficiency and reliability of scientific progress.”
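As a back-of-the-envelope check, the per-group rates are consistent with the overall 54% figure if the rounded 83% and 33% stand for 10 of 12 and 4 of 12 successful replications, with the two randomly selected studies then contributing no successes. These counts are an inference from the rounded percentages, not figures quoted in the article.

```python
top_successes = 10     # ~83% of the 12 highest-priced studies
bottom_successes = 4   # ~33% of the 12 lowest-priced studies
random_successes = 0   # assumed; not broken out in this summary
total = top_successes + bottom_successes + random_successes
print(f"{total}/26 = {total / 26:.0%}")  # 14/26 = 54%
```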

Paper:
Felix Holzmeister, Magnus Johannesson, Colin F. Camerer, Yiling Chen, Teck-Hua Ho, Suzanne Hoogeveen, Juergen Huber, Noriko Imai, Taisuke Imai, Lawrence Jin, Michael Kirchler, Alexander Ly, Benjamin Mandl, Dylan Manfredi, Gideon Nave, Brian A. Nosek, Thomas Pfeiffer, Alexandra Sarafoglou, Rene Schwaiger, Eric-Jan Wagenmakers, Viking Waldén, Anna Dreber: Examining the replicability of online experiments selected by a decision market, Nature Human Behaviour 2024, DOI: 10.1038/s41562-024-02062-9, online: https://www.nature.com/articles/s41562-024-02062-9
