Checking Data Analysis: Steve’s Campaign
Computational reproducibility is a mouthful, but the idea is simple: re-run the reported analysis on the reported data and get the reported results. Haven’t we always assumed that works? Yes, but…
Search for “computational reproducibility” and you’ll find recent discussions across numerous disciplines. The conclusion: it’s often very difficult, and we need new practices and tools.
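To make the idea concrete, here’s a minimal sketch of what a reproducibility check can amount to in practice: load the openly shared data, re-run the reported analysis, and compare the recomputed values with those in the article. The file name, variables, and reported values below are hypothetical, not taken from Steve’s editorial or any particular paper.

```python
# Minimal, hypothetical sketch of a computational-reproducibility check.
import pandas as pd
from scipy import stats

# Openly shared data file accompanying the (hypothetical) article.
data = pd.read_csv("study1_data.csv")

# Re-run the reported analysis: here, an independent-groups t test.
group_a = data.loc[data["condition"] == "A", "score"]
group_b = data.loc[data["condition"] == "B", "score"]
t, p = stats.ttest_ind(group_a, group_b)

# Values as reported in the (hypothetical) article.
reported_t, reported_p = 2.31, 0.023

# Do the recomputed values match the reported ones, to rounding?
assert abs(t - reported_t) < 0.005, f"t mismatch: got {t:.3f}"
assert abs(p - reported_p) < 0.0005, f"p mismatch: got {p:.4f}"
print("Reported t and p reproduced from the shared data and script.")
```

When the authors also share their analysis script, the check is simpler still: run their script as-is and confirm its output matches the article.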
Steve Lindsay is campaigning for psychology journals to do more to ensure that reported results do indeed arise, as claimed, from the data. His guest editorial in Meta-Psychology is here.


Steve, a highly experienced journal editor, has long worked to advance Open Science. (See my post here.) He makes a cogent argument as he discusses practical ways to achieve computational reproducibility without simply expecting already overworked volunteer expert reviewers to do the checking.


The good news is the arrival of a new Open Science badge to signal that an article provides full details of its data analysis, including an openly available analysis script or code.
Badge requirements are here and graphics here.
Steve’s editorial is worth a read. May it help change the world.
Geoff