Research

Figure: Comparison of original and replication findings (© Lukas Röseler)

Replication Research in Cooperation with FORRT

Background: The scientific record is biased towards exciting and surprising results, while replications, that is, repeated tests of published findings, have received little attention. In cooperation with the Framework for Open and Reproducible Research Training (FORRT) and together with a large international and interdisciplinary team of researchers, we have created the world's largest database of scientific replication studies. The findings are analyzed live and are openly accessible. Researchers can contribute new findings or use the database for their own questions.

Central findings: We have been using the database to estimate replication success and found a replication rate of 50%. That is, only five out of ten replication studies arrive at the same conclusions as the original studies did.
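Because the database is openly accessible, such estimates can be recomputed by anyone. As a minimal sketch (not the project's actual analysis code), assuming the data have been exported to a CSV file with one row per replication and a hypothetical "outcome" column, Python code along these lines would compute the rate:

# Minimal sketch: estimating a replication rate from a hypothetical
# CSV export of the database. The file name and the coding of the
# "outcome" column are assumptions for illustration only.
import pandas as pd

replications = pd.read_csv("replication_database.csv")

# Count a replication as successful if it reached the same conclusion
# as the original study (hypothetically coded as "success").
n_total = len(replications)
n_success = (replications["outcome"] == "success").sum()

print(f"Replication rate: {n_success / n_total:.0%} ({n_success}/{n_total})")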

Further information: All project materials are available online.

FORRT Replication Database
Framework for Open and Reproducible Research Training

FORRT logo (© https://forrt.org, CC BY-NC-SA 4.0)
Figure: Preregistered analysis script (© Lukas Röseler)

Preregistration of Research

Preregistrations can aid researchers in eliminating researcher degrees of freedom, but they can also help document non-teleological processes (e.g., exploratory or inductive research). We are investigating how definitions and implementations of preregistration templates differ between the natural, social, and cognitive sciences and the humanities, as well as between perspectives (i.e., confirmatory vs. exploratory). The aim is to provide an interdisciplinary manual for how to preregister research.

Figure: Comparison of published and unpublished findings (© Lukas Röseler)

Innovative Publication Models

Failure (disconfirmed hypotheses, unfit theories, problematic methods) is not only part of everyday scientific practice but also an essential part of scientific progress. Still, this type of research receives little attention in many fields (the file-drawer problem). We discuss theoretical and empirical reasons for this bias and shed light on an interdisciplinary spectrum of trial and error.

Peer Community In

MüCOS supports the Peer Community In (PCI) initiative and is part of the PCI network. PCI is a non-profit organization run by scientists that offers an alternative, high-quality peer review process independent of commercial publishers. You can find more information in an explanatory video. PCIs currently exist for over 15 scientific disciplines, and further PCIs are under development:
Overview of PCIs by scientific discipline

Figure: PCI process (© PCI, https://peercommunityin.org/current-pcis/)
Figure: Funnel plot from a dynamic meta-analysis (© Lukas Röseler)

Mega-Science / Dynamic Meta-Analyses / Cumulative Science

Several scientific disciplines have revised their understanding of cumulative science in a way that allows knowledge to be aggregated at the level of collected data instead of published results. We are developing a taxonomy of cumulative methods, developing software, and collecting examples of how scientific evidence can be aggregated systematically and dynamically.
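The core idea can be illustrated with a small sketch: a meta-analytic estimate that is recomputed whenever a new study's data arrive. The following Python snippet (with made-up effect sizes, not real data, and a simple fixed-effect model rather than our actual software) updates an inverse-variance-weighted estimate study by study:

# Illustrative sketch of a dynamically updating meta-analysis:
# a fixed-effect, inverse-variance-weighted estimate recomputed
# each time a new study is added. Effect sizes and standard errors
# below are made-up example values.
import math

studies = [  # (effect size, standard error) per incoming study
    (0.42, 0.15),
    (0.10, 0.20),
    (0.35, 0.12),
]

weight_sum = 0.0
weighted_effect_sum = 0.0
for i, (effect, se) in enumerate(studies, start=1):
    w = 1.0 / se ** 2  # inverse-variance weight
    weight_sum += w
    weighted_effect_sum += w * effect
    pooled = weighted_effect_sum / weight_sum
    pooled_se = math.sqrt(1.0 / weight_sum)
    print(f"After study {i}: pooled effect = {pooled:.3f} (SE = {pooled_se:.3f})")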

Figure: Transparency and trust (© Lukas Röseler)

Transparency and Trust

In the wake of the replicability crisis and the open science approaches that followed, the transparency and trustworthiness of scientific results have become a central concern of scientific activity and of meta-science. Transparency creates trust through increased verifiability, but it can also lead to uncertainty and mistrust and can be exploited for the targeted misuse of scientific results. In view of growing scientific skepticism and even hostility towards science among the public, for example in the context of the COVID-19 pandemic or the climate crisis, we combine research on the transparency and replicability of scientific findings with science communication analyses of trust in science.


Publications

  • Adler, S. J., Röseler, L., & Schöniger, M. K. (2023). A toolbox to evaluate the trustworthiness of published findings. Journal of Business Research, 167, 114189. https://doi.org/10.1016/j.jbusres.2023.114189
  • Back, M. D. (2019). Editorial: Increasing scientific quality in the expanding field of personality science. European Journal of Personality, 33, 3–6. https://doi.org/10.1002/per.2192
  • Back, M. D. (2020). Editorial: A brief wish list for personality research. European Journal of Personality, 34, 3–7. https://doi.org/10.1002/per.2236
  • Geukes, K., Schönbrodt, F. D., Utesch, T., Geukes, S., & Back, M. D. (2016). Wege aus der Vertrauenskrise – Individuelle Schritte hin zu verlässlicher und offener Forschung [Ways out of the crisis of confidence – Individual steps towards reliable and open science]. Zeitschrift für Sportpsychologie, 23, 99–109. https://doi.org/10.1026/1612-5010/a000167
  • Landy, J. F., Jia, M. L., Ding, I. L., Viganola, D., Tierney, W., Dreber, A., Johannesson, M., Pfeiffer, T., Ebersole, C. R., Gronau, Q. F., Ly, A., van den Bergh, D., Marsman, M., Derks, K., Wagenmakers, E.-J., Proctor, A., Bartels, D. M., Bauman, C. W., Brady, W. J., . . . Röseler, L., . . . Uhlmann, E. L. (2020). Crowdsourcing hypothesis tests: Making transparent how design choices shape research results. Psychological Bulletin, 146(5), 451–479. https://doi.org/10.1037/bul0000220
  • Röseler, L., & Schütz, A. (2022). Open Science. In A. Schütz, M. Brand, S. Steins-Loeber, M. Baumann, J. Born, V. Brandstätter, C.-C. Carbon, P. M. Gollwitzer, M. Hallschmid, S. Lautenbacher, L. Laux, B. Marcus, K. Moser, K. I. Paul, H. Plessner, F. Renkewitz, K.-H. Renner, K. Rentzsch, K. Rothermund, . . . S. Steins-Löber (Eds.), Psychologie: Eine Einführung in ihre Grundlagen und Anwendungsfelder (6., überarbeitete Auflage, pp. 187–198). Kohlhammer.
  • Röseler, L., Weber, L., Helgerth, K., Stich, E., Günther, M., Tegethoff, P., Wagner, F., Antunovic, M., Barrera-Lemarchand, F., Halali, E., Ioannidis, K., Genschow, O., Milstein, N., Molden, D. C., Papenmeier, F., Pavlovic, Z., Rinn, R., Schreiter, M. L., Zimdahl, M. F., . . . Schütz, A. (2022). The Open Anchoring Quest Dataset: Anchored Estimates from 96 Studies on Anchoring Effects. Journal of Open Psychology Data, 10(1), 16. https://doi.org/10.5334/jopd.67
  • Tierney, W., Hardy, J., Ebersole, C. R., Viganola, D., Clemente, E. G., Gordon, M., Hoogeveen, S., Haaf, J., Dreber, A., Johannesson, M., Pfeiffer, T., Huang, J. L., Vaughn, L. A., DeMarree, K., Igou, E. R., Chapman, H., Gantman, A., Vanaman, M., Wylie, J., . . . Röseler, L. . . . Uhlmann, E. L. (2021). A creative destruction approach to replication: Implicit work and sex morality across cultures. Journal of Experimental Social Psychology, 93, 104060. https://doi.org/10.1016/j.jesp.2020.104060
  • Wunsch, K., Pixa, N. H., & Utesch, K. (2023). Open Science in German Sport Psychology: State of the art and future directions. Zeitschrift für Sportpsychologie, 30(4), 156–166. https://doi.org/10.1026/1612-5010/a000406

Pre-Prints

  • Röseler, L., Kaiser, L., Doetsch, C. A., Klett, N., Seida, C., Schütz, A., … Zhang, Y. (2024, April 11). The Replication Database: Documenting the Replicability of Psychological Science. https://doi.org/10.31222/osf.io/me2ub
  • Röseler, L. (2023). Predicting Replication Rates with Z-Curve: A Brief Exploratory Validation Study Using the Replication Database. https://doi.org/10.31222/osf.io/ewb2t
  • Röseler, L., Carbon, C. C., & Schütz, A. (2023). Stellungnahme zum Positionspapier der Psychologie-Fachschaften-Konferenz (PsyFaKo e.V.) zum Thema Autor*innenschaft [Comment on the position paper of the Psychology Students' Conference (PsyFaKo e.V.) on authorship]. Advance online publication. https://doi.org/10.31234/osf.io/cbkzp
  • Röseler, L., Gendlina, T., Krapp, J., Labusch, N., & Schütz, A. (2022). Successes and Failures of Replications: A Meta-Analysis of Independent Replication Studies Based on the OSF Registries. Advance online publication. https://doi.org/10.31222/osf.io/8psw2
  • Röseler, L., Kaiser, L., Doetsch, C., Klett, N., Krapp, J., Seida, C., Schütz, A., Barth, C., Cummins, J., Dienlin, T., Elsherif, M., Förster, N., Genschow, O., Gnambs, T., Hartmann, H., Hilbert, L., Holgado, D., Hussey, I., Korbmacher, M., Kulke, L., Liu, Y., Lohkamp, F., Lou, N., Oomen, D., Papenmeier, F., Paruzel-Czachura, M., Pavlov, Y., Pavlović, Z., Pypno, K., Rausch, M., Rebholz, T., Ross, R., Thürmer, L., Vaughn, L. (2023). ReD: Replication Database, Version 0.4.0. https://dx.doi.org/10.17605/OSF.IO/9r62x