Raising the value of research studies in psychological science by increasing the credibility of research reports: the transparent Psi project

📄 Original study
Kekecs, Zoltan, Palfi, Bence, Szaszi, Barnabas, Szecsi, Peter, Zrubka, Mark, Kovacs, Marton, Bakos, Bence E, Cousineau, Denis, Tressoldi, Patrizio, Schmidt, Kathleen, Grassi, Massimo, Evans, Thomas Rhys, Yamada, Yuki, Miller, Jeremy K, Liu, Huanxu, Yonemitsu, Fumiya, Dubrov, Dmitrii, Roer, Jan Philipp, Becker, Marvin, Schnepper, Roxane, Ariga, Atsunori, Arriaga, Patricia, Oliveira, Raquel, Poldver, Nele, Kreegipuu, Kairi, Hall, Braeden, Wiechert, Sera, Verschuere, Bruno, Giran, Kyra, & Aczel, Balazs (2023)

Plain English Summary

Remember Daryl Bem's famous 2011 claim that people can sense the future? This massive project put it to the ultimate test. Twenty-nine experts, including both believers and skeptics, co-designed the study together so nobody could complain about unfair methods. Then ten labs across nine countries ran over 37,000 trials with 2,115 participants. The result? Participants guessed correctly 49.89% of the time, essentially dead-on chance (50%). The statistical evidence strongly favored 'nothing is happening here.' What makes this study special is its extreme transparency: open data, external audits, tamper-proof software, and real-time reporting. The conclusion is striking: Bem's original precognition effect disappears once you lock down researcher wiggle room and publication bias.

Actual Paper Abstract

The low reproducibility rate in social sciences has produced hesitation among researchers in accepting published findings at their face value. Despite the advent of initiatives to increase transparency in research reporting, the field is still lacking tools to verify the credibility of research reports. In the present paper, we describe methodologies that let researchers craft highly credible research and allow their peers to verify this credibility. We demonstrate the application of these methods in a multi-laboratory replication of Bem's Experiment 1 (Bem 2011 J. Pers. Soc. Psychol. 100, 407–425. (doi:10.1037/a0021524)) on extrasensory perception (ESP), which was co-designed by a consensus panel including both proponents and opponents of Bem's original hypothesis. In the study we applied direct data deposition in combination with born-open data and real-time research reports to extend transparency to protocol delivery and data collection. We also used piloting, checklists, laboratory logs and video-documented trial sessions to ascertain as-intended protocol delivery, and external research auditors to monitor research integrity. We found 49.89% successful guesses, while Bem reported 53.07% success rate, with the chance level being 50%. Thus, Bem's findings were not replicated in our study. In the paper, we discuss the implementation, feasibility and perceived usefulness of the credibility-enhancing methodologies used throughout the project.

Research Notes

The most methodologically transparent replication of Bem's "Feeling the Future" (FTF) Experiment 1 to date, and unique for its adversarial consensus design process involving both proponents and skeptics. Central to Controversy #2 (the Bem FTF debate), as it directly tests the "pure bias theory" prediction that positive psi findings vanish under rigorous controls.

A pre-registered, multi-laboratory replication of Bem's (2011) Experiment 1 on precognition, co-designed by a consensus panel of 29 experts including both proponents and opponents. Across 10 laboratories in 9 countries, 2,115 participants completed 37,836 forced-choice erotic trials. Hit rate was 49.89% (chance = 50%), yielding BF₀₁ ≈ 72 — strong Bayesian evidence for the null model. The 90% HDI for population success rate was [49.57%, 50.40%], with > 99% probability below 50.6%. Credibility-enhancing tools included direct data deposition, born-open data, real-time reports, external audits, and tamper-evident software. The authors conclude that Bem's original effect does not survive when researcher degrees of freedom and publication bias are controlled.
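The Bayesian result above can be illustrated with a back-of-the-envelope calculation. This is only a sketch under assumed simplifications, not the paper's analysis: the number of successful guesses is approximated from the reported 49.89% hit rate, and the alternative hypothesis uses a uniform Beta(1, 1) prior on the success rate, whereas the published BF₀₁ ≈ 72 came from the study's pre-registered informed prior, so the magnitudes will differ.

```python
from math import lgamma, log, exp

# Figures reported in the paper
n = 37836                # total forced-choice trials
k = round(0.4989 * n)    # successes implied by the 49.89% hit rate (approximation)

# log of the binomial coefficient C(n, k)
log_choose = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

# H0: the success rate is exactly 0.5 (pure chance)
log_m0 = log_choose + n * log(0.5)

# H1: success rate unknown, uniform Beta(1, 1) prior (an assumption of this
# sketch, not the paper's informed prior); the beta-binomial marginal
# likelihood under a uniform prior simplifies to 1 / (n + 1)
log_m1 = -log(n + 1)

# Bayes factor in favor of the null; values far above 1 favor "chance only"
bf01 = exp(log_m0 - log_m1)
print(f"BF01 under a uniform prior: {bf01:.1f}")
```

With these inputs the Bayes factor comes out on the order of 10², pointing the same way as the published result: strong evidence for the chance-only model. The magnitude differs from the reported ≈ 72 because a Bayes factor is sensitive to the prior placed on the alternative.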

📋 Cite this paper
APA
Kekecs, Z., Palfi, B., Szaszi, B., Szecsi, P., Zrubka, M., Kovacs, M., Bakos, B. E., Cousineau, D., Tressoldi, P., Schmidt, K., Grassi, M., Evans, T. R., Yamada, Y., Miller, J. K., Liu, H., Yonemitsu, F., Dubrov, D., Roer, J. P., Becker, M., . . . Aczel, B. (2023). Raising the value of research studies in psychological science by increasing the credibility of research reports: The transparent Psi project. Royal Society Open Science. https://doi.org/10.1098/rsos.191375
BibTeX
@article{kekecs_2023_transparent_psi,
  title = {Raising the value of research studies in psychological science by increasing the credibility of research reports: the transparent Psi project},
  author = {Kekecs, Zoltan and Palfi, Bence and Szaszi, Barnabas and Szecsi, Peter and Zrubka, Mark and Kovacs, Marton and Bakos, Bence E and Cousineau, Denis and Tressoldi, Patrizio and Schmidt, Kathleen and Grassi, Massimo and Evans, Thomas Rhys and Yamada, Yuki and Miller, Jeremy K and Liu, Huanxu and Yonemitsu, Fumiya and Dubrov, Dmitrii and Roer, Jan Philipp and Becker, Marvin and Schnepper, Roxane and Ariga, Atsunori and Arriaga, Patricia and Oliveira, Raquel and Poldver, Nele and Kreegipuu, Kairi and Hall, Braeden and Wiechert, Sera and Verschuere, Bruno and Giran, Kyra and Aczel, Balazs},
  year = {2023},
  journal = {Royal Society Open Science},
  doi = {10.1098/rsos.191375},
}