Conceptual framework
The Knowledge Production Modes (KPM) framework was developed to evaluate the relevance and feasibility of reproducibility as a research practice and as a criterion of research quality in diverse settings. Relevance is shaped by a community’s epistemology, methodological standards, and any proprietary or commercial constraints. Feasibility depends on the subject matter, available resources, the type of reproducibility pursued, and the level of uncertainty associated with theoretical and methodological conditions.
- Knowledge Production Modes: The Relevance and Feasibility of Reproducibility: https://doi.org/10.31222/osf.io/ujnd9
- A conceptual review of uses and meanings of reproducibility and replication: https://osf.io/preprints/metaarxiv/entu4_v3
Evidence synthesis
Findings from the reviews listed below indicate that reproducibility is hindered by the absence of data and code, inadequate documentation, unstable computational environments, and, in qualitative research, a mismatch with standard expectations for reproducibility. They also identify drivers such as growing awareness of reproducibility issues, wider adoption of open science standards, and strong documentation and ethical data practices. Together, these insights support the need for a threaded publication model, one that links methods, data, code, and contextual materials to address these barriers.
- Reproducibility in machine-learning-based research: Overview, barriers, and drivers: https://onlinelibrary.wiley.com/doi/10.1002/aaai.70002
- Reproducibility and replicability of qualitative research: an integrative review of concepts, barriers and enablers: https://osf.io/preprints/metaarxiv/n5zkw_v1
- Open science interventions to improve reproducibility and replicability of research: a scoping review: https://royalsocietypublishing.org/doi/10.1098/rsos.242057
To gain a deeper understanding of the causes, consequences, and potential solutions for perceived low reproducibility in research across different contexts, TIER2 focused on the social, life, and computer sciences, as well as research publishers and funders. The project aimed to raise awareness, build capacity, and propose innovative solutions tailored to diverse research cultures. Central to this effort were eight pilot activities designed to develop, implement, and evaluate new reproducibility-related tools and practices, with a strong emphasis on stakeholder engagement and collaboration throughout the project.
- D4.3 Pilot implementation reflection report including assessment of efficacy & recommendations for future developments: https://osf.io/7e6dy
- D5.1 Reproducibility toolset (tools & practices) for researchers: https://osf.io/5nqh6/overview
- D5.2 Reproducibility toolset (tools & practices) for publishers: https://osf.io/s7gjv/overview
- D5.3 Reproducibility toolset (tools & practices) for funders: https://osf.io/pfjth/overview
The TIER2 Reproducibility Hub (ReproHub), hosted at the Embassy of Good Science, is a resource dedicated to strengthening trust, transparency, and efficiency in research. It brings together outputs from the TIER2 project, including definitions, stakeholder insights, and tools from eight pilot activities, and connects users with related initiatives to foster cross-disciplinary collaboration. The Hub provides guidance for building reproducibility networks and awards, and is supported by policy briefs and studies outlining stakeholders’ visions.
A range of co-creation formats was used to engage the TIER2 stakeholder groups at various stages of developing, implementing, and evaluating the new reproducibility-related tools and practices, fostering collaboration throughout the project.
The TIER2 Award was established to support the creation of three new Reproducibility Networks in Horizon Europe “Widening Participation” countries. Following the initial open call in 2023, two consortia in Georgia and Ukraine received awards. To further strengthen efforts in the region, a third award was granted to a consortium in Serbia in 2024.
The Reproducibility Training modules are free courses designed for researchers, publishers, and practitioners committed to enhancing research integrity. They combine theoretical knowledge with practical guidance, covering topics such as principles of reproducibility, open science practices, methodological and epistemological considerations, operational checks, and tools to enhance transparency, reliability, and trustworthiness across different research contexts.
The stakeholder roadmap outlines priorities for improving research reproducibility. It includes practical policy recommendations, guidelines, and briefs for funders, institutions, policymakers, administrators, integrity officers, publishers, and researchers in Europe and beyond. The policy briefs promote reproducibility-sensitive policymaking that accounts for epistemic diversity across research fields and highlight the need for stronger incentives, open practices, and dedicated support to enhance the transparency and trustworthiness of AI research.
- D2.6 Policy Briefing 2 will be available in February 2026.
- D3.2 Validated key impact pathways for reproducibility, including recommendations, will be available in February 2026.
Policy briefs
- Reproducibility and Epistemic Diversity: https://osf.io/va45y
- Enhancing Research Reproducibility: TIER2’s Contributions to the European Open Science Cloud (EOSC): https://osf.io/jbqe5
- Open Science for Artificial Intelligence: Implementing Reproducibility to Promote Trust in AI: [link]
- Policy brief on qualitative research: [link]
- Policy brief (recommendations): [link]