How do we measure success for Open Research?

To deliver the benefits of Open Research, increasing adoption of Open Research practices is a prerequisite. There is a growing need for reliable data on the extent to which Open Research (or Open Science) practices are being adopted: to better understand researchers' practices, to support institutional planning, and to assess the effectiveness of interventions intended to change research culture and practice.

The 2021 UNESCO Recommendation on Open Science has prompted the establishment of a global Open Science monitoring framework, and progress has been made in parallel at the national level in France with the French Open Science Monitor. Further, research institutions in the UK will begin piloting new ways to monitor indicators of open research (science) in 2024. Scholarly publishers and funding agencies with progressive policies on open science, extending beyond open access to publications, are also establishing monitoring approaches. The non-profit open access publisher PLOS, for example, launched 'Open Science Indicators' (OSIs), an initiative that uses artificial intelligence to track adoption of Open Science practices, such as sharing data, code, protocols and preprints, over time in the scholarly literature. These OSIs were reused in the UK Committee on Research Integrity's (UKCORI) 2023 report on research integrity in the UK as a way to benchmark aspects of the transparency of UK research. PLOS, DataSeer and other solution providers are collaborating with the UK Reproducibility Network (UKRN) as part of its Open Research Indicators pilots.

This poster demonstrates growing activity in the creation of new measures of research integrity and transparency from different stakeholders, including meta-researchers, policy makers, research institutions, technology providers and publishers, and showcases examples. It also foregrounds the importance of establishing common principles for the use and development of these new metrics and indicators.
These principles include pragmatism; transparency; reproducibility; efficiency; interoperability; alignment with community standards; responsible use; and acknowledgement of the limitations of quantitative metrics.