Buckley, Yvonne M. and Bardgett, Richard and Gordon, Rowena and Iler, Amy and Mariotte, Pierre and Ponton, Samantha and Hector, Andy (2025) Using dynamic documents to mend cracks in the reproducible research pipeline. Journal of Ecology, 113 (2). pp. 270-274. ISSN 0022-0477
JEcol-2024-1172_Proof_hi.pdf - Accepted Version
Available under License Creative Commons Attribution.
Abstract
There is a research reproducibility crisis, including in ecology. The research pipeline from conception to publication has many cracks, which means that it may not be possible to repeat and verify published results. Reproducibility means that the results of a study can be reproduced from the original data. It is a critical step in the quality assurance of a study; indeed, the re-use and subsequent citation of methods from reproducible research can increase the impact of the work beyond the findings of the specific study. In theory, given the original data, code and documentation, all research results could be reproduced. However, sufficient information must be available to understand and reproduce the data handling, analysis and modelling, and that information should be accessible enough that reproduction takes only reasonable effort. Various open-source tools allow scientists to annotate their scripts so that they can easily produce dynamic documents giving a more accessible account of the analysis (HTML, PDF and various word-processor file types). Popular options—including Jupyter notebooks, the R Markdown package and the new multi-language Quarto application—produce documents that weave together the input code and software-generated output (text, tables and figures) with the author's explanatory text to give a clear narrative of the analysis process. We therefore now encourage the submission of supplementary dynamic documents to the Journal of Ecology to improve the reproducibility and transparency of research published in the journal. Reproducibility can be assessed prior to the submission of the work for publication, during peer review and post-publication. Authors are encouraged to provide three file types: the data, an executable dynamic document and a static reproducibility PDF file that integrates and annotates the input code with the statistical output.
We provide some basic examples of dynamic documents for reproducibility.
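To illustrate the idea, the sketch below shows what a minimal Quarto source file of the kind described above might look like; the file name, data file and model are hypothetical stand-ins, not taken from any particular study. The author's narrative text sits alongside an executable R chunk, and rendering the file re-runs the analysis and weaves the code, its output and the explanatory text into a single report.

````
---
title: "Reproducibility supplement (hypothetical example)"
format: pdf
---

## Analysis

We model seedling counts as a function of treatment using a Poisson GLM.

```{r}
# Read the data archived alongside the paper (hypothetical file name)
seedlings <- read.csv("seedling_counts.csv")

# Fit the model described in the Methods
m <- glm(count ~ treatment, family = poisson, data = seedlings)

# The rendered document embeds this output next to the code that produced it
summary(m)
```
````

Running `quarto render` on such a file (for example, `quarto render analysis.qmd --to pdf`) regenerates the static reproducibility PDF directly from the data and code, so the executable document and the static report cannot drift apart.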