A Community Model for Rigorous and Inclusive Scholarship

Inaugural Editorial of Replication Research (R2)

Authors

Röseler, L., Wallrich, L., Adler, S., Oppong Boakye, P., Evans, T. R., Goltermann, J., Haven, T., Horstmann, J., Korbmacher, M., Müller, M., Verheyen, S., Visser, I., & Azevedo, F.

DOI:

https://doi.org/10.17879/replicationresearch-2025-9022

Keywords:

reproduction, replication, metascience, open science, open access

Abstract

Reproducibility and replicability are vital for trustworthy, cumulative research, yet remain undervalued in most areas of academic publishing. Replication Research (R2) is a Diamond Open Access journal dedicated to publishing high-quality reproductions, replications, and related methodological work across disciplines. With robust standards for transparency, open peer review, and social responsibility, R2 offers practical guidance and support for authors. We aim to rebalance research culture by valuing diligence and robustness alongside innovation, thereby increasing confidence in research findings. We invite researchers to contribute to and benefit from an open, community-driven journal designed to elevate the status and impact of replications (repeated studies of published findings with different data) and reproductions (repeated tests of published findings with the same data). In this editorial, we introduce the aims, policies, and scope of Replication Research, outlining how the journal will operate and the values that guide it.

Published

2025-10-20

How to Cite

Röseler, L., Wallrich, L., Adler, S., Oppong Boakye, P., Evans, T. R., Goltermann, J., Haven, T., Horstmann, J., Korbmacher, M., Müller, M., Verheyen, S., Visser, I., & Azevedo, F. (2025). A Community Model for Rigorous and Inclusive Scholarship: Inaugural Editorial of Replication Research (R2). Replication Research, 1. https://doi.org/10.17879/replicationresearch-2025-9022

Section

Editorial Communication
