Improving the replicability of results from a single psychological experiment
We identify two aspects of selective inference as major obstacles to replicability: (i) the practice of highlighting a subset of statistical results without accounting for the multiple comparisons made in the analysis from which they were selected, and (ii) the file-drawer effect, the tendency to publish only statistically significant results. We propose to address the first issue by controlling the false discovery rate (FDR) using the hierarchical Benjamini-Hochberg procedure of Benjamini and Bogomolov. To address the second issue, we propose constructing confidence intervals and estimators that are conditioned on passing a threshold level of statistical significance. We apply the proposed methodologies to the 100 experimental psychology studies whose replication was tested as part of the Reproducibility Project in Psychology (RPP). We show that these two simple-to-use tools can enhance the replicability of published findings without sacrificing statistical power, and that they remain essential even when adhering to alternative methods proposed for addressing the replicability crisis in psychology.
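For illustration only, the following is a minimal Python sketch of a two-level procedure in the spirit of the Benjamini-Bogomolov hierarchical BH approach: each study's p-values are treated as one family, families are selected by BH applied to per-family Simes p-values, and findings within selected families are then tested by BH at a level adjusted for the selection. The function names and toy data are assumptions for the example and are not taken from the paper.

```python
import numpy as np

def bh(pvals, q):
    """Benjamini-Hochberg: boolean mask of rejections at FDR level q."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    passed = p[order] <= q * np.arange(1, m + 1) / m
    rejected = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.max(np.nonzero(passed)[0])   # largest index meeting the BH condition
        rejected[order[:k + 1]] = True
    return rejected

def simes(pvals):
    """Simes combination p-value summarizing one family (one study)."""
    p = np.sort(np.asarray(pvals, dtype=float))
    m = len(p)
    return float(np.min(m * p / np.arange(1, m + 1)))

def hierarchical_bh(families, q=0.05):
    """Two-level selection: BH on family-level Simes p-values, then BH within
    each selected family at the adjusted level q * (#selected / #families)."""
    family_p = np.array([simes(f) for f in families])
    selected = bh(family_p, q)
    r, n = int(selected.sum()), len(families)
    within = [bh(f, q * r / n) if keep else np.zeros(len(f), dtype=bool)
              for keep, f in zip(selected, families)]
    return selected, within

# Toy example: each array holds the p-values reported in one study.
studies = [np.array([0.001, 0.030, 0.200]),
           np.array([0.400, 0.550, 0.700]),
           np.array([0.004, 0.010, 0.600])]
selected, within = hierarchical_bh(studies, q=0.05)
```

The second tool, inference conditional on crossing a significance threshold, can likewise be sketched for the simplest case of a known-variance z-statistic by inverting the CDF of a truncated normal; this is an illustrative construction under those assumptions, and the paper's exact procedure may differ.

```python
from scipy import stats
from scipy.optimize import brentq

def conditional_ci(z, z_thresh=1.96, level=0.95):
    """Equal-tailed CI for mu when Z ~ N(mu, 1) is reported only because
    |Z| exceeded z_thresh: invert the conditional (truncated-normal) CDF."""
    def cond_cdf(mu):
        # P(Z <= z | |Z| > z_thresh) under mean mu
        denom = stats.norm.sf(z_thresh - mu) + stats.norm.cdf(-z_thresh - mu)
        if z <= -z_thresh:
            num = stats.norm.cdf(z - mu)
        else:
            num = (stats.norm.cdf(-z_thresh - mu)
                   + stats.norm.cdf(z - mu) - stats.norm.cdf(z_thresh - mu))
        return num / denom

    alpha = 1.0 - level
    lo = brentq(lambda mu: cond_cdf(mu) - (1 - alpha / 2), z - 10, z + 10)
    hi = brentq(lambda mu: cond_cdf(mu) - alpha / 2, z - 10, z + 10)
    return lo, hi

print(conditional_ci(2.5))   # compare with the naive interval 2.5 +/- 1.96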
```