Shaping Journal Editorial Policy To Increase Reproducibility and Transparency

Sunday, February 19, 2017: 1:00 PM-2:30 PM
Room 302 (Hynes Convention Center)
D. Stephen Lindsay, University of Victoria, Victoria, BC, Canada
Psychological Science is arguably the most influential of the journals that publish primary research across domains of scientific psychology. The journal has enjoyed a very high profile, but it has also been roundly criticized for too often publishing flashy findings of questionable replicability. My predecessor as Editor in Chief, Eric Eich, introduced a number of initiatives intended to increase the replicability of effects published in Psychological Science, and I have continued and extended those efforts. Examples include (a) adoption of the Transparency and Openness Promotion guidelines; (b) awarding of badges to encourage and advertise research preregistration, data posting, and materials posting; (c) removal of word-count limits on Method sections, together with a requirement that relevant statistical information be reported in detail; (d) use of StatCheck on submissions to detect inconsistencies in reports of common statistical tests (illustrated in the sketch below); (e) creation of a team of Statistical Advisors to assist action editors and the Editor in Chief with statistical issues; and (f) publication of the action editor's name with each article. I also published an editorial in the journal, titled "Replication in Psychological Science," that attempted to explain clearly some of the major threats to replicability and offered specific guidelines for enhancing it. One promising indicator is that the proportion of articles for which the data and/or materials have been posted online has soared, far surpassing that of comparison journals.
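To make point (d) concrete, here is a minimal sketch, in Python, of the kind of consistency check that StatCheck performs. StatCheck itself is an R package, so this is not its API; the regular expression, tolerance value, and manuscript text below are invented for illustration. The idea is simply to parse a reported t test of the form "t(df) = value, p = value," recompute the two-tailed p-value from the reported t and degrees of freedom, and flag any mismatch.

```python
# Hypothetical sketch of a StatCheck-style consistency check (not the
# statcheck R package's actual API). Parses "t(df) = x, p = y" reports,
# recomputes the two-tailed p-value, and flags discrepancies.
import re
from scipy import stats

REPORT = re.compile(r"t\((\d+)\)\s*=\s*(-?\d*\.?\d+),\s*p\s*=\s*(\d*\.?\d+)")

def check_t_reports(text, tolerance=0.005):
    """Yield (reported_p, recomputed_p, consistent) for each t test found."""
    for df, t_val, p_rep in REPORT.findall(text):
        # Two-tailed p-value implied by the reported t statistic and df.
        p_calc = 2 * stats.t.sf(abs(float(t_val)), int(df))
        yield float(p_rep), p_calc, abs(p_calc - float(p_rep)) <= tolerance

# Invented example: t(28) = 2.20 actually implies p ~= .036, not .012.
manuscript = "The groups differed, t(28) = 2.20, p = .012."
for reported, recomputed, ok in check_t_reports(manuscript):
    print(f"reported p = {reported}, recomputed p = {recomputed:.3f}, "
          f"{'consistent' if ok else 'INCONSISTENT'}")
```

The real StatCheck handles many more test types (F, chi-square, r, z) and reporting conventions, but the underlying logic is the same recompute-and-compare step sketched here.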

I believe that promoting preregistration is one of the most powerful ways to increase replicability. Whereas nearly half of our articles now have data and/or materials posted on repositories, the percentage of articles reporting preregistered research is still in the single digits. Of course, preregistration by definition must be completed before data are analyzed, so uptake of this practice will inevitably be more gradual than uptake of posting data or stimuli (which authors can decide to do at the last minute when submitting a manuscript, or at any time up until publication). I am working to promote preregistration in several ways, including a recent article coauthored with Dan Simons and Scott Lilienfeld in the APS Observer (http://www.psychologicalscience.org/observer/research-preregistration-101#.WEB4sOYrLRY). More concretely, I encourage my action editors, when they invite revisions with new data, to stipulate that the follow-up study be preregistered. Those efforts are already bearing fruit.