Measuring the Prevalence of Questionable Research Practices With Incentives for Truth Telling
Skeptical/Critical
Plain English Summary
This landmark survey asked how psychology research actually gets done. Nearly 6,000 psychologists were invited, and over 2,000 responded to a survey using Bayesian Truth Serum, a clever scoring method that rewards honest answers. The results were jaw-dropping: 94% admitted at least one shady shortcut. Two-thirds didn't report everything they measured, over half peeked at results before deciding whether to collect more data, and half cherry-picked which studies to publish. Roughly 1 in 10 appears to have falsified data. These habits aren't slip-ups; they're the norm. The paper is a cornerstone of the "replication crisis" debate, and it cuts both ways: the same shortcuts that produce false positives could equally undermine failed replications.
Actual Paper Abstract
Cases of clear scientific misconduct have received significant media attention recently, but less flagrantly questionable research practices may be more prevalent and, ultimately, more damaging to the academic enterprise. Using an anonymous elicitation format supplemented by incentives for honest reporting, we surveyed over 2,000 psychologists about their involvement in questionable research practices. The impact of truth-telling incentives on self-admissions of questionable research practices was positive, and this impact was greater for practices that respondents judged to be less defensible. Combining three different estimation methods, we found that the percentage of respondents who have engaged in questionable practices was surprisingly high. This finding suggests that some questionable practices may constitute the prevailing research norm.
Research Notes
Landmark survey empirically grounding the replication crisis. Directly relevant to psi: skeptics cite it as explaining psi's failure to replicate via QRP contamination; psi proponents note QRPs equally impair skeptical failed-replication studies. Cited by Kennedy papers in this library.
Survey of 5,964 academic psychologists (N = 2,155 respondents, 36% response rate) measured the prevalence of questionable research practices (QRPs) using Bayesian Truth Serum (BTS) incentives for truthful disclosure. Admission rates were surprisingly high: 94% of BTS-condition respondents admitted at least one QRP, including failing to report all dependent measures (66.5%), collecting more data after checking whether results were significant (58%), and selectively reporting studies that "worked" (50%). BTS incentives raised admission rates most for practices respondents judged less defensible. Geometric-mean estimates suggest roughly 1 in 10 psychologists has falsified data. The items formed an approximate Guttman scale (coefficient of reproducibility = .80). The findings suggest QRPs may constitute the de facto scientific norm, with researchers rationalizing borderline behaviors as defensible.
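The Bayesian Truth Serum scoring rule mentioned above (due to Prelec) asks each respondent for both an answer and a prediction of how common each answer is in the population, then rewards answers that are "surprisingly common" relative to predictions. A minimal sketch of that scoring logic, written from the published formula rather than the authors' survey code (function and variable names are our own):

```python
import math

def bts_scores(answers, predictions, alpha=1.0):
    """Bayesian Truth Serum scores, per Prelec's scoring rule.

    answers:     list of chosen option indices, one per respondent
    predictions: list of per-respondent predicted population
                 frequencies, one list of length k per respondent
    alpha:       weight on the prediction-accuracy term
    """
    n = len(answers)
    k = len(predictions[0])
    eps = 1e-9  # guard against log(0)

    # Actual endorsement frequencies (arithmetic mean of answers).
    xbar = [sum(1 for a in answers if a == j) / n for j in range(k)]
    # Geometric mean of the predicted frequencies for each option.
    ybar = [math.exp(sum(math.log(max(p[j], eps)) for p in predictions) / n)
            for j in range(k)]

    scores = []
    for a, pred in zip(answers, predictions):
        # Information score: high when one's own answer is more common
        # than the population predicted ("surprisingly common").
        info = math.log(max(xbar[a], eps) / max(ybar[a], eps))
        # Prediction score: penalizes inaccurate population forecasts.
        predict = alpha * sum(
            xbar[j] * math.log(max(pred[j], eps) / max(xbar[j], eps))
            for j in range(k))
        scores.append(info + predict)
    return scores
```

In the actual study, respondents in the BTS condition were told that a charity donation would scale with the truthfulness of their answers as judged by this kind of scoring, which is what made honest admission the payoff-maximizing strategy.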
Links
Related Papers
Cites
- False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant – Simmons, Joseph P (2011)
- Why Most Published Research Findings Are False – Ioannidis, John P.A (2005)
- Correcting the Past: Failures to Replicate Psi – Galak, Jeff (2012)
Companion
- Why Psychologists Must Change the Way They Analyze Their Data: The Case of Psi – Wagenmakers, Eric-Jan (2011)
- Registered Reports: A Method to Increase the Credibility of Published Results – Nosek, Brian A (2014)
- Estimating the Reproducibility of Psychological Science – Open Science Collaboration (2015)
- Commentary: Reproducibility in Psychological Science: When Do Psychological Phenomena Exist? – Heino, Matti T. J (2017)
- Replication Unreliability in Psychology: Elusive Phenomena or "Elusive" Statistical Power? – Tressoldi, Patrizio E (2012)
More in Methodology
Paranormal belief, conspiracy endorsement, and positive wellbeing: a network analysis
Planning Falsifiable Confirmatory Research
Addressing Researcher Fraud: Retrospective, Real-Time, and Preventive Strategies – Including Legal Points and Data Management That Prevents Fraud
Quantum Aspects of the Brain-Mind Relationship: A Hypothesis with Supporting Evidence
Paranormal beliefs and cognitive function: A systematic review and assessment of study quality across four decades of research
Cite this paper
John, Leslie K, Loewenstein, George, Prelec, Drazen (2012). Measuring the Prevalence of Questionable Research Practices With Incentives for Truth Telling. Psychological Science. https://doi.org/10.1177/0956797611430953
@article{john_2012_questionable_practices,
title = {Measuring the Prevalence of Questionable Research Practices With Incentives for Truth Telling},
author = {John, Leslie K and Loewenstein, George and Prelec, Drazen},
year = {2012},
journal = {Psychological Science},
doi = {10.1177/0956797611430953},
}