<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">
<article article-type="research-article" xml:lang="EN" xmlns:xlink="http://www.w3.org/1999/xlink">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">LIBER</journal-id>
<journal-title-group>
<journal-title>LIBER QUARTERLY</journal-title>
</journal-title-group>
<issn pub-type="epub">2213-056X</issn>
<publisher>
<publisher-name>openjournals.nl</publisher-name>
<publisher-loc>The Hague, The Netherlands</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">lq.14947</article-id>
<article-id pub-id-type="doi">10.53377/lq.14947</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Emerging Roles and Responsibilities of Libraries in Support of Reproducible Research</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<contrib-id contrib-id-type="orcid">https://orcid.org/0000-0001-8036-5859</contrib-id>
<name>
<surname>Schmidt</surname>
<given-names>Birgit</given-names>
</name>
<email>bschmidt@sub.uni-goettingen.de</email>
<xref ref-type="aff" rid="aff1"/>
</contrib>
<contrib contrib-type="author">
<contrib-id contrib-id-type="orcid">https://orcid.org/0000-0001-7336-8330</contrib-id>
<name>
<surname>Chiarelli</surname>
<given-names>Andrea</given-names>
</name>
<email>andrea.chiarelli@research-consulting.com</email>
<xref ref-type="aff" rid="aff2"/>
</contrib>
<contrib contrib-type="author">
<contrib-id contrib-id-type="orcid">https://orcid.org/0000-0002-3548-3124</contrib-id>
<name>
<surname>Loffreda</surname>
<given-names>Lucia</given-names>
</name>
<email>lucia.loffreda@research-consulting.com</email>
<xref ref-type="aff" rid="aff2"/>
</contrib>
<contrib contrib-type="author">
<contrib-id contrib-id-type="orcid">https://orcid.org/0000-0002-9866-0239</contrib-id>
<name>
<surname>Sondervan</surname>
<given-names>Jeroen</given-names>
</name>
<email>j.sondervan@nwo.nl</email>
<xref ref-type="aff" rid="aff3"/>
<xref ref-type="aff" rid="aff4"/>
</contrib>
<aff id="aff1">State and University Library, University of G&#x00F6;ttingen, G&#x00F6;ttingen, Germany</aff>
<aff id="aff2">Research Consulting, Nottingham, UK</aff>
<aff id="aff3">Utrecht University, Utrecht, The Netherlands</aff>
<aff id="aff4">Dutch Research Council (NWO), The Hague, The Netherlands</aff>
</contrib-group>
<pub-date pub-type="epub">
<month>12</month>
<year>2023</year>
</pub-date>
<volume>33</volume>
<fpage>1</fpage>
<lpage>21</lpage>
<permissions>
<copyright-statement>Copyright 2023, The copyright of this article remains with the author</copyright-statement>
<copyright-year>2023</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. See <uri xlink:href="http://creativecommons.org/licenses/by/4.0/">http://creativecommons.org/licenses/by/4.0/</uri>.</license-p>
</license>
</permissions>
<self-uri xlink:href="https://www.liberquarterly.eu/article/10.53377/lq.14947"/>
<abstract>
<p>Ensuring the reproducibility of research is a multi-stakeholder effort that comes with challenges and opportunities for individual researchers and research communities, librarians, publishers, funders and service providers. These emerge at various steps of the research process, and, in particular, at the publication stage. Previous work by Knowledge Exchange highlighted that, while there is growing awareness among researchers, reproducible publication practices have been slow to change. Importantly, research reproducibility has not yet reached institutional agendas: this work seeks to highlight the rationale for libraries to initiate and/or step up their engagement with this topic, which we argue is well aligned with their core values and strategic priorities. We draw on secondary analysis of data gathered by Knowledge Exchange, focusing on the literature identified as well as interviews held with librarians. We extend this through further investigation of the literature and by integrating the findings of discussions held at the 2022 LIBER conference, to provide an updated picture of how libraries engage with research reproducibility. Libraries have a significant role in promoting responsible research practices, including transparency and reproducibility, by leveraging their connections to academic communities and collaborating with stakeholders like research funders and publishers. Our recommendations for libraries include: i) partnering with researchers to promote a research culture that values transparency and reproducibility; ii) enhancing existing research infrastructure and support; and iii) investing in raising awareness and developing skills and capacities related to these principles.</p>
</abstract>
<kwd-group>
<kwd>research reproducibility</kwd>
<kwd>open science</kwd>
<kwd>publishing</kwd>
<kwd>scholarly communication</kwd>
<kwd>research data management</kwd>
<kwd>FAIR</kwd>
<kwd>code and software</kwd>
<kwd>research support</kwd>
<kwd>research libraries</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<title>1. Introduction</title>
<p>On the one hand, research libraries are a stable entity when it comes to reliable information management for their institution and broader research communities. On the other hand, they are undergoing constant transformation in response to their changing environments and related roles and responsibilities, for example in the area of open research. As opportunities and challenges continually emerge, libraries adopt and drive new or enhanced information practices. In recent years, the extent to which research findings can be trusted has been challenged by the inability to reproduce or replicate research findings (the so-called &#x201C;reproducibility crisis&#x201D;),<xref ref-type="fn" rid="fn1"><sup>1</sup></xref> which is often due to limited or no access to research data, lack of documentation of methodologies and/or limited or no availability of the software and code used to run an analysis. These issues, which are often considered part of research integrity (<xref ref-type="bibr" rid="r9">Diaba-Offeh, 2021</xref>), apply to published research and can severely affect a researcher&#x2019;s career, with consequences ranging from minor errors to severe ones leading to retractions or even investigations of research misconduct.</p>
<p>We acknowledge that there is no universally agreed definition of &#x201C;research reproducibility&#x201D;; the related terms &#x201C;replication&#x201D;, &#x201C;reproduction&#x201D; and &#x201C;repetition&#x201D; are sometimes used interchangeably and sometimes as distinct concepts, depending on context and research discipline (<xref ref-type="bibr" rid="r39">Stanford Encyclopedia of Philosophy, 2018</xref>; <xref ref-type="bibr" rid="r25">National Academies of Sciences, Engineering, and Medicine [NASEM], 2018</xref>, <xref ref-type="bibr" rid="r26">2019</xref>). For the purpose of this paper, we refer to reproducibility as the ability to reproduce the findings of a study based on access to the original data and methods/procedures. This usually requires a detailed description of the methods used to process and analyse the data, access to any relevant datasets (and related documentation) and an ability to obtain and run computer code, where appropriate. It must be noted that definitions of reproducibility often focus on &#x201C;computational reproducibility&#x201D;. Findings based on qualitative methodologies (e.g. literature reviews, interviews, manual thematic analysis) will often not be fully reproducible, owing to subjective elements in assessment, classification, selection, contextualisation or interpretation processes.</p>
<p>Between 2020 and 2021, Knowledge Exchange (KE) investigated the publication of reproducible research outputs, with support from a group of international open science experts. The KE partners are six key national organisations within Europe tasked with developing infrastructure and services to enable the use of digital technologies to improve higher education and research: IT Center for Science (CSC) in Finland, Centre national de la recherche scientifique (CNRS) in France, the Danish e-Infrastructure Consortium (DeiC) in Denmark, the German Research Foundation (DFG) in Germany, Jisc in the UK and SURF in The Netherlands. The study was initiated by KE&#x2019;s Open Access Expert Group, which focuses on the ongoing transformation of the scholarly publishing landscape.<xref ref-type="fn" rid="fn2"><sup>2</sup></xref> One core finding was that ensuring the reproducibility of research is a multi-stakeholder effort, with challenges and opportunities for individual researchers and research communities, librarians, publishers, funders and service providers. These emerge at various steps of the research cycle, and, in particular, at the publication stage.</p>
<p>The output of this work was a public report (<xref ref-type="bibr" rid="r4">Chiarelli et al., 2021a</xref>, <xref ref-type="bibr" rid="r5">2021b</xref>), which was based on an extensive literature review of almost 130 sources and a mix of interviews and focus groups with 51 stakeholders from 12 countries. In this landscape study, stakeholders were considered at the micro, meso and macro level based on the <italic>KE Open Scholarship Framework</italic> (<xref ref-type="bibr" rid="r28">Neylon et al., 2019</xref>, cf. <xref ref-type="fig" rid="fg001">Figure 1</xref>).</p>
<fig id="fg001">
<label>Fig. 1:</label>
<caption><p>Knowledge Exchange Open Scholarship Framework.</p></caption>
<graphic xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="figures/LIBER_2023_33_Schmidt_fig1.jpg"/>
</fig>
<p>Across its levels, the Open Scholarship Framework considers researchers and research groups, their disciplinary communities and institutions as the content creators (micro level); academic libraries and their institutions as well as publishers and infrastructure providers in supporting and strategic roles (meso level); and research funders and policymakers as those who influence the broader ecosystem and establish recommendations and requirements (macro level).</p>
<p>In this context, the exact role that librarians should or could play remains unclear. For example, why should librarians care about the reproducibility of (published) research findings? Through what mechanisms could they enable and foster best practices in research and publishing, leveraging the services and activities that they manage in collaboration with other stakeholders? In this paper, we have sought to translate the results of the KE study into practical tools for research libraries.</p>
<p>Our paper begins with a background section that provides context and foundational information on research reproducibility, including with regard to the role of libraries. This is followed by an analysis of stakeholder roles and key touchpoints for librarians, enriched with relevant quotes. The paper ends with a summary of our findings and a series of recommendations to enhance the role of research libraries in the reproducibility landscape.</p>
</sec>
<sec id="s2">
<title>2. Background</title>
<p>Libraries&#x2019; engagement with reproducibility as an activity and set of skills has multiple perspectives: 1) From the perspective of <italic>collections management</italic>, libraries select, manage, curate and ensure the discoverability of the published record of research outputs. In the academic and societal environment in which libraries operate, challenges such as the &#x201C;reproducibility or replication crisis&#x201D; can affect trust in these collections. 2) In the context of <italic>research support</italic>, libraries directly engage with researchers by providing support and infrastructures for research data management as well as training in data (science) literacy and its relationship to open science (e.g. FAIR and open data) (<xref ref-type="bibr" rid="r21">Lyon, 2016</xref>; <xref ref-type="bibr" rid="r34">Quan, 2021</xref>). Building up services for the management and curation of research software is a relatively new yet emerging responsibility for libraries (<xref ref-type="bibr" rid="r3">Chassanoff et al., 2018</xref>; <xref ref-type="bibr" rid="r23">Martinez-Ortiz et al., 2023</xref>). 3) Institutional <italic>publishing services</italic> develop publication policies, workflows and infrastructures and provide support for transparency and openness (e.g. <xref ref-type="bibr" rid="r29">Nosek et al., 2023</xref>). This includes guidance on data publishing and citation, e.g. via institutional or external repositories, availability statements and licensing. 4) Across all these service areas, libraries contribute to awareness raising, training and implementation support for <italic>responsible research practices</italic> at various stages of the research and scholarly communication cycle as a safeguard for reliable research.</p>
<p>Over the last few years, the research reproducibility landscape has been evolving rapidly, as discussions around open access, FAIR and open data, research integrity, open source and open infrastructures continue to converge. More specifically, it has been pointed out that open science provides a means to improve the quality, transparency and reproducibility of research (<xref ref-type="bibr" rid="r24">Munaf&#x00F2; et al., 2017</xref>), leading to greater accountability of research. This also holds for the step of publishing research outcomes: &#x201C;executable research papers&#x201D; support reproducibility through a combination of text, raw data and code that (re)creates the analysis and figures, and offer readers means to interact with or manipulate the code (<xref ref-type="bibr" rid="r43">Tsang &#x0026; Maciocci, 2020</xref>). For researchers, taking up this approach to writing papers can turn out to be an insightful learning experience and journey through open science (<xref ref-type="bibr" rid="r18">Lasser, 2020</xref>). The approach is, however, far from becoming the established norm, as this would require a cultural shift in the scientific community towards making data and code openly available and a recognition of the skills and effort needed to implement all necessary steps.</p>
<p>To support and foster reproducibility across the research cycle, many stakeholders including researchers, scholarly societies, funders and publishers have created guidelines and recommendations (e.g. <xref ref-type="bibr" rid="r16">KNAW, 2018</xref>; <xref ref-type="bibr" rid="r24">Munaf&#x00F2; et al., 2017</xref>; <xref ref-type="bibr" rid="r27">National Institutes of Health, 2014</xref>; <xref ref-type="bibr" rid="r29">Nosek et al., 2023</xref>; <xref ref-type="bibr" rid="r44">Wellcome Trust, 2017</xref>).<xref ref-type="fn" rid="fn3"><sup>3</sup></xref> Moreover, collaborative initiatives led by researchers have formulated documentation standards in support of reproduction (e.g. Project TIER Protocol),<xref ref-type="fn" rid="fn4"><sup>4</sup></xref> produced hands-on guides (e.g. Turing Way Handbook<xref ref-type="fn" rid="fn5"><sup>5</sup></xref> (<xref ref-type="bibr" rid="r42">The Turing Way Community, 2022</xref>)) and reviews of existing infrastructures (e.g. <xref ref-type="bibr" rid="r15">Konkol et al., 2020</xref>), assessed the reproducibility of research results in funding programmes (e.g. <xref ref-type="bibr" rid="r11">European Commission et al., 2022a</xref>), and are co-creating activities and policy recommendations (e.g. <xref ref-type="bibr" rid="r37">Ross-Hellauer et al., 2022</xref>). These documents provide guidance for researchers but also point to infrastructures, tools and support services provided by a range of stakeholders. From a conceptual perspective, <xref ref-type="bibr" rid="r21">Lyon (2016)</xref> proposed a three-dimensional model or spectrum for open science, which includes transparency as a third axis. In this model, librarians act as &#x2018;transparency agents&#x2019; who advocate, promote and demonstrate particular behaviours and good practices throughout the research lifecycle, which over time will lead to cultural change towards a more open science environment. <xref ref-type="bibr" rid="r38">Sayre and Riegelman (2019)</xref> compared a selected set of reproducibility guidelines and mapped libraries&#x2019; expertise and services to the guidelines&#x2019; key themes. <xref ref-type="bibr" rid="r17">LaPolla et al. (2022)</xref> investigated motivations, approaches and lessons learned from libraries&#x2019; engagement &#x2013; typically in collaboration with other departments &#x2013; in teaching rigor and reproducibility in the health sciences.</p>
<p>When it comes to reproducibility, research libraries, publishers and other stakeholders need to navigate a web of interconnected roles and responsibilities, which includes not only the advice and support they traditionally provide, but also an ever-broadening range of tools and workflows spanning from choosing a repository for the deposit of data and code to preparing research compendia<xref ref-type="fn" rid="fn6"><sup>6</sup></xref> or executable research articles (<xref ref-type="bibr" rid="r32">Packer, 2020</xref>). Encouragingly, research libraries have taken up new roles, such as the &#x201C;data librarians/stewards&#x201D; that are becoming more common across Europe &#x2013; in particular in the context of implementing the FAIR data principles (<xref ref-type="bibr" rid="r45">Wilkinson et al., 2016</xref>) &#x2013; and the &#x201C;reproducibility librarians&#x201D; that are emerging in the USA and operate at the interface between the various facets of open science (<xref ref-type="bibr" rid="r40">Steeves, 2017</xref>).</p>
<p>The role of libraries and research support services with regard to reproducibility is also changing in terms of digital infrastructures. The fast-paced world of technological solutions emerging to support open science practices creates uncertainty and a constant need to upskill both the research base and support staff, so researchers require ever more help in navigating it. Whether internally or in partnership with external providers, research libraries also continue to develop institutional and thematic repositories, which can be interconnected with a web of digital objects to enable reproducible research to take place. The European Open Science Cloud (EOSC) is an example of ongoing efforts to build a trusted distributed digital research infrastructure that supports verifiable and reproducible research (<xref ref-type="bibr" rid="r12">European Commission, 2022b</xref>).</p>
<p>Finally, the broader context of the COVID-19 pandemic has highlighted the role of research reproducibility in enabling trust in the published record. For example, it was found that rapidly emerging preprints tackling important public health questions did not always meet high transparency criteria and could improve their data sharing and scientific reporting practices (<xref ref-type="bibr" rid="r41">Sumner et al., 2020</xref>). Research libraries can play a dual role in this regard, as they can be both the custodians of research objects and advisers to local academics who seek to work in a reproducible way and enhance the quality of their institution&#x2019;s publications.</p>
</sec>
<sec id="s3">
<title>3. Methods and Data</title>
<p>This work presents a secondary analysis of the findings gathered during the 2020&#x2013;2021 KE exercise on the Art of Publishing Reproducible Research Outputs.<xref ref-type="fn" rid="fn7"><sup>7</sup></xref> That study comprised a literature review and a stakeholder consultation, and its findings were presented at the 2022 LIBER conference.</p>
<p>To provide an updated view of the reproducibility landscape, we have built on the literature review conducted as part of the KE exercise as a starting point. This literature review included almost 130 literature sources from across academic disciplines and multidisciplinary sources and provided an overview of definitions, problems, recommendations and solutions or initiatives in the context of reproducibility (<xref ref-type="bibr" rid="r4">Chiarelli et al., 2021a</xref>; <xref ref-type="bibr" rid="r20">Loffreda &#x0026; Chiarelli, 2021</xref>). In the present study, we have revisited the literature findings to focus particularly on the key roles and responsibilities for research libraries. We have then conducted a brief review of literature to reflect any recent developments in the field of reproducibility that may affect or relate to these roles, such as the establishment of new initiatives and networks.</p>
<p>In addition to the updated review of literature, we have conducted a secondary analysis of the interview transcripts gathered as part of the original exercise (<xref ref-type="bibr" rid="r5">Chiarelli et al., 2021b</xref>). The transcripts reviewed comprise interviews with 41 international stakeholders, including research librarians, researchers, publishers and funders, and focus groups with a further 10 infrastructure providers. Text searches, including for &#x201C;library&#x201D; and &#x201C;steward&#x201D;, were conducted on these sources to identify examples of research support staff enabling the publication of reproducible research. Quotes from interviews are used throughout this paper to illustrate our key findings.</p>
<p>Finally, based on the above-mentioned methodology, we have considered how the findings of the updated landscape analysis align to LIBER&#x2019;s 2023&#x2013;2027 strategy (<xref ref-type="bibr" rid="r13">Frey et al., 2022</xref>). This approach has informed the development of the three main recommendations presented in the final section of this paper.</p>
</sec>
<sec id="s4">
<title>4. Limitations</title>
<p>We note that our updated literature and landscape analysis was conducted in January and February 2023 and thus represents a snapshot of the reproducibility landscape at a given point in time. It should also be noted that the original analysis was underpinned by thematic coding, which by nature relies on the subjective interpretation of the researcher.</p>
</sec>
<sec id="s5">
<title>5. Stakeholder Roles and Key Touchpoints for Librarians</title>
<p>Due to the multifaceted nature of research reproducibility, roles and responsibilities are inevitably complex and must be shared between different stakeholder groups at the aforementioned micro (researchers and disciplines), meso (institutions, libraries, publishers, software providers) and macro levels (research funders, policymakers). In particular, we note that cooperation is essential throughout the research lifecycle, as the publication of a reproducible research output requires a range of considerations from the inception of a research project all the way through its delivery and dissemination. One academic interviewee highlighted this point:</p>
<disp-quote>
<p><italic>&#x201C;I think the main thing is that these things need to be thought of early in the process, and if you start thinking about them too late it becomes harder to do them. So, in the context of data sharing for example, if you&#x2019;re collecting data from human participants you need to ask them for consent for data sharing, and if you leave it until after you&#x2019;ve collected the data you can&#x2019;t share the data as openly as if you have collected consent from people early on.&#x201D;</italic> (Academic interviewee 1)</p>
</disp-quote>
<p>One of the key challenges in research reproducibility is the need to share a broad range of research objects (as applicable) alongside one&#x2019;s outputs, including detailed methods, pre-registrations, research data, any relevant code/software and information about the software environment used. Notably, in listing these research objects we do not mean that reproducibility is inherently quantitative or computational in nature. It is, however, essential to acknowledge that the terminology we employed above may be misunderstood or misinterpreted in some disciplines, which highlights the importance of stakeholders speaking a shared language. For example, reproducibility guidance may be inappropriate at the funder or publisher level and much more helpful at the funding council or journal level, so as to provide a degree of tailoring that allows the target audience to understand any requirements. This sentiment was also expressed by another academic interviewed as part of the landscape study:</p>
<disp-quote>
<p><italic>&#x201C;I think a recognition of the fact that there&#x2019;s no single bullet, there will not be one system, or one set of practice that is going to rule everything, in that everybody will have to comply with. So that the recognition that it has to be, there will be different processes, different tools, and so it&#x2019;s essentially a training challenge, and innovation should come in the area of education.&#x201D;</italic> (Academic interviewee 2)</p>
</disp-quote>
<p>When it comes to the sharing of the above-mentioned research objects, it is clear that researchers play a key role: they are responsible for designing, delivering and disseminating the research and hold the information on relevant research objects (including metadata). At the same time, researchers can act as peer reviewers in cases where journals have reproducibility requirements, meaning that they may be responsible for checking that someone else&#x2019;s work is reproducible. In the case of reproducibility, it is also helpful to think of researchers as members of disciplinary communities: in most cases, reproducibility is unlikely to become a widespread concern unless the (sub)discipline establishes that this should be the case. For example, some disciplines show widespread data and code sharing (with implications for reproducibility), particularly where physical or digital infrastructure is shared (e.g. particle physics, astronomy), while others are currently not engaging with the concept. It is, however, recognised that reproducibility &#x201C;is relevant across scientific disciplines and cultures, be it in the humanities, engineering, life science, natural sciences, or social sciences&#x201D;, although projects in different disciplines may take &#x201C;demonstrably different approaches, varying from enhancing step-by-step reproducibility through code-based transparency to tracing the origin of an argument through publication lines&#x201D; (<xref ref-type="bibr" rid="r35">Rahal et al., 2022</xref>). Furthermore, the role of individual reproducibility champions (including with the support of national reproducibility networks) in raising the profile of practices underpinning reproducible publication in disciplines where these remain uncommon is broadly recognised (<xref ref-type="bibr" rid="r5">Chiarelli et al., 2021b</xref>; <xref ref-type="bibr" rid="r37">Ross-Hellauer et al., 2022</xref>).
This is where the potential for institutions to make a difference is highest, as fostering such efforts and providing related support is conducive to the broader uptake of new practices. As one academic interviewee highlighted:</p>
<disp-quote>
<p><italic>&#x201C;I think right now you have a few champions per community, which are driving efforts at different pace, in different spaces. So you have the people who are involved in producing tools&#x2026; there are people who have driven studies reproducing large numbers of results. So I think it&#x2019;s more a bunch of individuals and what we are trained to do is to find a way to get a forum to get some of those to communicate and inspire others.&#x201D;</italic> (Academic interviewee 3)</p>
</disp-quote>
<p>Arguably, libraries are uniquely placed to provide support around research reproducibility, as they are the institutional reference point for both authors and publishers and are typically well connected with other professional services such as IT, software engineering, ethics, data governance and data protection. Clearly, the remit of librarians varies based on national and institutional contexts, meaning that provision around research reproducibility can take a range of different forms. For example, New York University and Oregon Health &#x0026; Science University employ dedicated research data management and reproducibility librarians,<xref ref-type="fn" rid="fn8"><sup>8</sup></xref> while at the Delft University of Technology reproducibility is within the remit of research data stewards (<xref ref-type="bibr" rid="r22">Martinez-Lavanchy et al., 2022</xref>). In other cases, reproducibility may be supported by a mix of roles, including research object curators or subject librarians, as was suggested by a research support librarian:</p>
<disp-quote>
<p><italic>&#x201C;I mean it might be a bit difficult because there are so many different disciplines and fields. I was thinking whether that would be somehow related to the library services, that&#x2019;s the natural place, where there could be someone who they could ask to help. Like if you want to have some kind of geometric study you would just go to your library staff and they could help you. It could be something similar with reproducible science.&#x201D;</italic> (Librarian interviewee 1)</p>
</disp-quote>
<p>A wide range of initiatives is being introduced as part of grassroots efforts to support research reproducibility, and these are often advertised and signposted within institutions by libraries in collaboration with researchers, IT and research services. They include community initiatives such as ReproHacks,<xref ref-type="fn" rid="fn9"><sup>9</sup></xref> ReproducibiliTea,<xref ref-type="fn" rid="fn10"><sup>10</sup></xref> The Carpentries<xref ref-type="fn" rid="fn11"><sup>11</sup></xref> and the Turing Way Handbook<sup>5</sup> (<xref ref-type="bibr" rid="r42">The Turing Way Community, 2022</xref>), the national Reproducibility Networks (e.g. UK, Germany, Sweden, Portugal)<xref ref-type="fn" rid="fn12"><sup>12</sup></xref> and local awareness raising and capacity building efforts (e.g. <xref ref-type="bibr" rid="r36">Rethlefsen et al., 2018</xref>). These initiatives are complemented by platforms, software and technical tools to support the management, sharing and execution of relevant research objects; a list of examples is available in <xref ref-type="bibr" rid="r5">Chiarelli et al. (2021b)</xref>. At a high level, we note the importance of tools for reproducibility checking, software containerisation, code sharing and documentation, reproducible environments, research compendia and more.<xref ref-type="fn" rid="fn13"><sup>13</sup></xref> In most cases, individual researchers will not be familiar with these tools: given the great and continually increasing pressures on researchers&#x2019; time, it is essential for institutions (and, in most cases, libraries) to provide a degree of training and/or support around these systems if reproducibility is considered a strategic priority. One institutional librarian noted:</p>
<disp-quote>
<p><italic>&#x201C;We do our best to get that information out, but there are still a lot of things to do I believe, to try to teach these skills also. Not only the tools and support, but also for them to be able to use these tools.&#x201D;</italic> (Librarian interviewee 2)</p>
</disp-quote>
<p>In practice, research libraries are only beginning to engage with reproducibility, not least because it is not currently a priority in all disciplines and lags behind other parts of the open research agenda (e.g. open access, open data), partly due to the mix of complex practical requirements and expectations. This is where the role of research funders and government players becomes apparent. To date, minimal pressure has been perceived around reproducibility at the policy level, and institutions have consequently, in most cases, not seen this as an area for strategic investment. This is not due to a lack of interest in reproducibility, but rather an acknowledgement that library (and, more broadly, institutional) budgets continue to be under growing pressure and areas of expenditure have to be prioritised. An interview with an institutional librarian touched on this topic, and they noted:</p>
<disp-quote>
<p><italic>&#x201C;That&#x2019;s [reproducibility] something we are actively trying to support but we get handcuffed in our own jobs because of lack of resources and lack of funding. People would rather put money into new science than preserving old science and that&#x2019;s also been a huge problem. So we talk a big game about reproducibility but no one&#x2019;s thinking in the long term.&#x201C;</italic> (Librarian interviewee 2)</p>
</disp-quote>
<p>As noted above, however, publishers in some disciplines are beginning to introduce reproducibility requirements, meaning that strategic pressures that might typically arise at the macro level are in fact emerging at the meso level and as part of micro-level grassroots movements led by researchers and disciplines.</p>
</sec>
<sec id="s6">
<title>6. Conclusions</title>
<p>This paper set out to identify the rationale for libraries to engage with the reproducibility of (published) research findings and to outline a range of mechanisms and practices that would enable this. The discussion so far indicates that libraries have a natural role to play in raising awareness of responsible research practices &#x2013; including efforts that aim to increase transparency and enhance the reproducibility of research &#x2013; and in providing support for their day-to-day implementation. For this, libraries can build on their strong connections to academic communities and their awareness of disciplinary needs. To cover all steps of the research and scholarly communication cycle, collaboration with other stakeholders such as research funders and publishers is essential.</p>
<p>In sum, we would like to propose three recommendations for the role of research libraries in the reproducibility landscape: librarians should team up with researchers (and other stakeholders) to promote an (institutional) <italic>research culture that supports and rewards</italic> the principles of research transparency and reproducibility; build on and expand existing <italic>research infrastructure and support</italic>; and engage and invest in raising <italic>awareness and</italic> the creation of <italic>skills and capacities</italic>.</p>
<sec id="s6a">
<title>6.1. Institutional Research Culture</title>
<p>Transparency, openness and reproducibility are not just good practice principles but means to enhance the quality, integrity and effectiveness of research. To foster a research culture that encourages, acknowledges and rewards researchers&#x2019; efforts in this regard, librarians should step up their engagement and act as allies and partners in the ongoing reform of research assessment (<xref ref-type="bibr" rid="r6">CoARA, 2022</xref>; <xref ref-type="bibr" rid="r7">Council of the European Union, 2022</xref>, <xref ref-type="bibr" rid="r8">2023</xref>; <xref ref-type="bibr" rid="r10">European Commission, 2021</xref>). In particular, they can:</p>
<list list-type="bullet">
<list-item><p>Lead or contribute to the development or amendment of institutional research policies (e.g. with reference to funder policies, and general or disciplinary good practice principles).</p></list-item>
<list-item><p>Engage in discussion on incentives and recognition of achievements with respect to transparency and reproducibility of research (e.g. for data reuse, evaluation criteria).</p></list-item>
</list>
</sec>
<sec id="s6b">
<title>6.2. Research and Publication Infrastructures, Guidance and Support</title>
<p>Research can be made more robust and reproducible through careful planning, management and documentation of workflows, covering all relevant inputs, processing steps, outcomes and interconnections. While there are multiple environments and tools for implementing such workflows, a crucial step is the integration with reliable research and publication infrastructures (e.g. data repositories, library-based publishing) that support the sharing, publication and archiving of research outcomes. In this context, librarians can:</p>
<list list-type="bullet">
<list-item><p>Build on existing expertise in terms of providing support, training and creating infrastructures for open access and research data management.</p></list-item>
<list-item><p>Preserve and share methodology, data, software and other inputs and outputs to enable research replication and to build public knowledge (<xref ref-type="bibr" rid="r19">Library Publishing Coalition Ethical Framework Task Force, 2023</xref>).</p></list-item>
<list-item><p>Invest in open infrastructures in the areas of FAIR data, code and software that deliver storage, tools, services and workflows for researchers to enhance their reproducibility. An essential element to take into account is the interoperability of such infrastructure.</p></list-item>
<list-item><p>Engage in the management and curation of research data and research software, which is key to achieving greater FAIRness and long-term usability (<xref ref-type="bibr" rid="r1">Arguillas et al., 2022</xref>; <xref ref-type="bibr" rid="r23">Martinez-Ortiz et al., 2023</xref>).</p></list-item>
<list-item><p>Invest in (inter)national networks that can support reproducibility (e.g. (national) research integrity networks, the Research Data Alliance (RDA) and the Research Software Alliance (ReSA)).</p></list-item>
<list-item><p>Engage with open source projects and communities (<xref ref-type="bibr" rid="r34">Quan, 2022</xref>), as these develop the tools and practices that enable reproducible research (e.g. Project Jupyter community, ROpenSci community).<xref ref-type="fn" rid="fn14"><sup>14</sup></xref></p></list-item>
</list>
</sec>
<sec id="s6c">
<title>6.3. Awareness, Skills and Capacities</title>
<p>Librarians are well-positioned to train and provide guidance on essential aspects of research reproducibility, and can integrate these into existing or emerging efforts. This touches on various steps of the research cycle, in particular:</p>
<list list-type="bullet">
<list-item><p>basic data science skills (e.g. <xref ref-type="bibr" rid="r21">Lyon, 2016</xref>; <xref ref-type="bibr" rid="r30">Oliver et al., 2019</xref>) including tidy data, data cleaning, data visualization via diverse tools (e.g. The Carpentries modules on data and software);</p></list-item>
<list-item><p>research data management (e.g. <xref ref-type="bibr" rid="r21">Lyon, 2016</xref>; <xref ref-type="bibr" rid="r30">Oliver et al., 2019</xref>): data management plans, documentation and sharing of data and code, FAIR and open data;</p></list-item>
<list-item><p>publishing: data and software availability statements, basics of peer review, data citation, publication formats (data or software papers, etc.) and publication venues (e.g. innovative publication platforms) that support reproducibility;</p></list-item>
<list-item><p>reproducibility hackathons (repro-hacks), in collaboration with researchers (<xref ref-type="bibr" rid="r14">Hettne et al., 2020</xref>);</p></list-item>
<list-item><p>guidance on tools that help to make research more reproducible (<xref ref-type="bibr" rid="r21">Lyon, 2016</xref>).</p></list-item>
</list>
<p>Moreover, these new roles can be made visible by adding support for research reproducibility to job descriptions, e.g. those of data stewards, open science advisors and community managers. However, this also requires commitment from libraries and library communities to invest further in activities such as shared educational programmes, and practical toolkits and guidance on the transparency and reproducibility of research.</p>
<p>These aims and actions also link very well to libraries&#x2019; core values and strategic priorities: libraries as engaged and trusted hubs are committed to i) advancing research and scholarship that is increasingly equitable and open, ii) providing state-of-the-art and scholar-focused services and iii) upskilling the library workforce. The primary aim is to address the needs of their user communities, in collaboration with local, national and international partners (<xref ref-type="bibr" rid="r2">Association of Research Libraries [ARL], 2022</xref>; LIBER, 2022).</p>
<p>However, while there are good reasons for libraries to engage in activities in support of research reproducibility, we would also like to point out several challenges and barriers. Research methodologies and tools have become more data-intensive across almost all disciplines. As already pointed out, definitions of research reproducibility and views on its relevance vary by discipline. Librarians are, on the one hand, knowledgeable about the information needs of the disciplines; however, this does not necessarily include up-to-date, in-depth familiarity with their research questions and methodologies. They might therefore not be well equipped to provide advice on all necessary steps and available options related to research reproducibility. A further challenge is to shift or secure additional resources and to address the long-term sustainability of infrastructures in support of research reproducibility and open research. In a fragmented landscape in particular, libraries are not necessarily recognised within their own institutions as crucial players and providers of these infrastructures and services, addressing both generic and disciplinary needs.</p>
<p>We think that research libraries are an essential part of the puzzle when it comes to achieving research reproducibility. How research communities, universities and national research agendas address the reproducibility challenge may vary greatly. However, we hope that these recommendations, and the highlighting of libraries as a key stakeholder group, will encourage the research library community to step up its engagement in aligning strategies and common goals in this area.</p>
<p>As discussed above, collaboration with other stakeholders (funders, publishers, service providers, researchers and their networks) at the local and (inter)national level is crucial to reach consensus on priorities and required investments, to provide guidance and support for implementation, to track progress made and to address emerging needs, from an overarching and, where needed, disciplinary perspective.</p>
</sec>
</sec>
</body>
<back>
<sec id="s7">
<title>Note from the Authors</title>
<p>The quotes used in this article have been edited by us only to remove grammatical errors and improve readability. No adjustments were made to the content.</p>
</sec>
<sec id="s8" sec-type="data-availability">
<title>Data Availability</title>
<p>A data management plan, all datasets and code that have been created in the context of the Publishing Reproducible Research Outcomes (PRRO) project are available via Zenodo: <ext-link ext-link-type="uri" xlink:href="https://zenodo.org/communities/ke-prro/">https://zenodo.org/communities/ke-prro/</ext-link>. In particular, this includes datasets related to the literature review, conducted interviews and the analysis of social media data.</p>
</sec>
<ack>
<title>Acknowledgement</title>
<p>This article builds on previous work and research by Knowledge Exchange and Research Consulting. The authors would like to thank all colleagues and organisations involved for their contributions, reflected in the final report <italic>The Art of Publishing Reproducible Research Outputs: Supporting emerging practices through cultural and technological innovation</italic>, which has been the essential groundwork for this endeavour.</p>
</ack>
<fn-group>
<title>Notes</title>
<fn id="fn1"><p>The &#x201C;reproducibility crisis&#x201D;, also known as replication or replicability crisis, refers to the observation that a large proportion of scientific studies published across disciplines do not replicate (<xref ref-type="bibr" rid="r31">Open Science Collaboration, 2015</xref>). Cf. Framework for Open and Reproducible Research Training (FORRT) Glossary: Reproducibility crisis, <ext-link ext-link-type="uri" xlink:href="https://forrt.org/glossary/reproducibility-crisis-aka-replicab/">https://forrt.org/glossary/reproducibility-crisis-aka-replicab/</ext-link>; <xref ref-type="bibr" rid="r33">Parsons et al., 2022</xref>.</p></fn>
<fn id="fn2"><p>Knowledge Exchange: Open Access, <ext-link ext-link-type="uri" xlink:href="https://www.knowledge-exchange.info/projects/project/open-access">https://www.knowledge-exchange.info/projects/project/open-access</ext-link>.</p></fn>
<fn id="fn3"><p>National Institutes of Health (NIH), <ext-link ext-link-type="uri" xlink:href="https://www.nih.gov/research-training/rigor-reproducibility">https://www.nih.gov/research-training/rigor-reproducibility</ext-link>.</p></fn>
<fn id="fn4"><p>Project TIER Protocol 4.0, <ext-link ext-link-type="uri" xlink:href="https://www.projecttier.org/tier-protocol/protocol-4-0">https://www.projecttier.org/tier-protocol/protocol-4-0</ext-link>.</p></fn>
<fn id="fn5"><p>The Turing Way community and handbook, <ext-link ext-link-type="uri" xlink:href="https://www.turing.ac.uk/research/research-projects/turing-way">https://www.turing.ac.uk/research/research-projects/turing-way</ext-link>.</p></fn>
<fn id="fn6"><p>Research Compendium, <ext-link ext-link-type="uri" xlink:href="https://research-compendium.science/">https://research-compendium.science/</ext-link>.</p></fn>
<fn id="fn7"><p>Knowledge Exchange: Open Access, <ext-link ext-link-type="uri" xlink:href="https://www.knowledge-exchange.info/projects/project/open-access">https://www.knowledge-exchange.info/projects/project/open-access</ext-link>.</p></fn>
<fn id="fn8"><p>New York University: Research Data Management, <ext-link ext-link-type="uri" xlink:href="https://guides.nyu.edu/data_management">https://guides.nyu.edu/data_management</ext-link>; Oregon Health &#x0026; Science University: Research Data and Reproducibility, <ext-link ext-link-type="uri" xlink:href="https://libguides.ohsu.edu/research-data-services/OHSU/Library">https://libguides.ohsu.edu/research-data-services/OHSU/Library</ext-link>.</p></fn>
<fn id="fn9"><p>ReproHack Hub, <ext-link ext-link-type="uri" xlink:href="https://www.reprohack.org/">https://www.reprohack.org/</ext-link>.</p></fn>
<fn id="fn10"><p>ReproducibiliTea, <ext-link ext-link-type="uri" xlink:href="https://reproducibilitea.org/">https://reproducibilitea.org/</ext-link>.</p></fn>
<fn id="fn11"><p>The Carpentries, <ext-link ext-link-type="uri" xlink:href="https://carpentries.org/">https://carpentries.org/</ext-link>.</p></fn>
<fn id="fn12"><p>UKRN: International Reproducibility Networks, <ext-link ext-link-type="uri" xlink:href="https://www.ukrn.org/international-networks/">https://www.ukrn.org/international-networks/</ext-link>.</p></fn>
<fn id="fn13"><p>E.g. Cascad, <ext-link ext-link-type="uri" xlink:href="https://www.cascad.tech/">https://www.cascad.tech/</ext-link>, Docker, <ext-link ext-link-type="uri" xlink:href="https://www.docker.com/resources/what-container/">https://www.docker.com/resources/what-container/</ext-link>, GitHub <ext-link ext-link-type="uri" xlink:href="https://github.com/">https://github.com/</ext-link>, GitLab, <ext-link ext-link-type="uri" xlink:href="https://about.gitlab.com/">https://about.gitlab.com/</ext-link>, Binder <ext-link ext-link-type="uri" xlink:href="https://mybinder.org/">https://mybinder.org/</ext-link>, Whole Tale <ext-link ext-link-type="uri" xlink:href="https://wholetale.org">https://wholetale.org</ext-link> and ReproZip, <ext-link ext-link-type="uri" xlink:href="https://www.reprozip.org">https://www.reprozip.org</ext-link>.</p></fn>
<fn id="fn14"><p>Project Jupyter community, <ext-link ext-link-type="uri" xlink:href="https://jupyter.org/community">https://jupyter.org/community</ext-link>; ROpenSci, <ext-link ext-link-type="uri" xlink:href="https://ropensci.org/">https://ropensci.org/</ext-link>.</p></fn>
</fn-group>
<ref-list>
<title>References</title>
<ref id="r1"><mixed-citation>Arguillas, F., Christian, T.-M., Gooch, M., Honeyman, T., &#x0026; Peer, L. (2022). <italic>10 Things for Curating Reproducible and FAIR Research</italic> (Version 1.1). Research Data Alliance. <ext-link ext-link-type="doi" xlink:href="10.15497/RDA00074">https://doi.org/10.15497/RDA00074</ext-link></mixed-citation></ref>
<ref id="r2"><mixed-citation>Association of Research Libraries (ARL). (2022). <italic>ARL&#x2019;s 2023-2026 Action Plan.</italic> <ext-link ext-link-type="uri" xlink:href="https://www.arl.org/who-we-are/#section-action-plan">https://www.arl.org/who-we-are/#section-action-plan</ext-link></mixed-citation></ref>
<ref id="r3"><mixed-citation>Chassanoff, A., AlNoamany, Y., Thornton, K., &#x0026; Borghi, J. (2018). Software curation in research libraries: Practice and promise. <italic>Journal of Librarianship and Scholarly Communication, 6</italic>(1), Article eP2239. <ext-link ext-link-type="doi" xlink:href="10.7710/2162-3309.2239">https://doi.org/10.7710/2162-3309.2239</ext-link></mixed-citation></ref>
<ref id="r4"><mixed-citation>Chiarelli, A., Loffreda, L., &#x0026; Johnson, R. (2021a, April 8). <italic>Publishing reproducible research outputs - Literature findings</italic> [Conference presentation]. <ext-link ext-link-type="doi" xlink:href="10.5281/zenodo.4675457">https://doi.org/10.5281/zenodo.4675457</ext-link></mixed-citation></ref>
<ref id="r5"><mixed-citation>Chiarelli, A., Loffreda, L., &#x0026; Johnson, R. (2021b). <italic>The Art of Publishing Reproducible Research Outputs: Supporting emerging practices through cultural and technological innovation.</italic> Knowledge Exchange. <ext-link ext-link-type="doi" xlink:href="10.5281/zenodo.5521077">https://doi.org/10.5281/zenodo.5521077</ext-link></mixed-citation></ref>
<ref id="r6"><mixed-citation>Coalition on Advancing Research Assessment (CoARA). (2022). <italic>Agreement on reforming research assessment</italic>. <ext-link ext-link-type="uri" xlink:href="https://coara.eu/agreement/the-agreement-full-text/">https://coara.eu/agreement/the-agreement-full-text/</ext-link></mixed-citation></ref>
<ref id="r7"><mixed-citation>Council of the European Union. (2022). <italic>Research assessment and implementation of Open Science</italic>. <ext-link ext-link-type="uri" xlink:href="https://www.consilium.europa.eu/media/56958/st10126-en22.pdf">https://www.consilium.europa.eu/media/56958/st10126-en22.pdf</ext-link></mixed-citation></ref>
<ref id="r8"><mixed-citation>Council of the European Union. (2023). <italic>High-quality, transparent, open, trustworthy and equitable scholarly publishing.</italic> <ext-link ext-link-type="uri" xlink:href="https://data.consilium.europa.eu/doc/document/ST-9616-2023-INIT/en/pdf">https://data.consilium.europa.eu/doc/document/ST-9616-2023-INIT/en/pdf</ext-link></mixed-citation></ref>
<ref id="r9"><mixed-citation>Diaba-Nuhoho, P., &#x0026; Amponsah-Offeh, M. (2021). Reproducibility and research integrity: The role of scientists and institutions. <italic>BMC Research Notes</italic>, <italic>14</italic>(1). <ext-link ext-link-type="doi" xlink:href="10.1186/s13104-021-05875-3">https://doi.org/10.1186/s13104-021-05875-3</ext-link></mixed-citation></ref>
<ref id="r10"><mixed-citation>European Commission. Directorate General for Research and Innovation. (2021). <italic>Towards a reform of the research assessment system: Scoping report.</italic> Publications Office. <ext-link ext-link-type="uri" xlink:href="https://data.europa.eu/doi/10.2777/707440">https://data.europa.eu/doi/10.2777/707440</ext-link></mixed-citation></ref>
<ref id="r11"><mixed-citation>European Commission. (2022a). <italic>Assessing the reproducibility of research results in EU Framework Programmes for Research: Final report.</italic> Publications Office. <ext-link ext-link-type="uri" xlink:href="https://data.europa.eu/doi/10.2777/186782">https://data.europa.eu/doi/10.2777/186782</ext-link></mixed-citation></ref>
<ref id="r12"><mixed-citation>European Commission. (2022b). <italic>Strategic Research and Innovation Agenda (SRIA) of the European Open Science Cloud (EOSC)</italic>. Publications Office of the European Union. <ext-link ext-link-type="uri" xlink:href="https://data.europa.eu/doi/10.2777/935288">https://data.europa.eu/doi/10.2777/935288</ext-link></mixed-citation></ref>
<ref id="r13"><mixed-citation>Frey, J., Roche, J., Kautonen, H., Schmidt, B., Tsakonas, G., van Wijngaarden, H., Smit, A., Cassafieres, C. S., Burman, L., Verheusen, A., Knigge, R., Joss-Bethlehem, E., &#x0026; van der Graaf, M. (2023). <italic>LIBER Strategy 2023-2027</italic>. LIBER. <ext-link ext-link-type="doi" xlink:href="10.5281/zenodo.7696568">https://doi.org/10.5281/zenodo.7696568</ext-link></mixed-citation></ref>
<ref id="r14"><mixed-citation>Hettne, K., Proppert, R. K. K., Nab, L., Saunero, L. P. R., &#x0026; Gawehns, D. (2020). <italic>ReprohackNL 2019: How libraries can promote research reproducibility through community engagement</italic>. SocArXiv Papers. <ext-link ext-link-type="doi" xlink:href="10.31235/osf.io/6f4zv">https://doi.org/10.31235/osf.io/6f4zv</ext-link></mixed-citation></ref>
<ref id="r15"><mixed-citation>Konkol, M., N&#x00FC;st, D., &#x0026; Goulier, L. (2020). Publishing computational research&#x2014;A review of infrastructures for reproducible and transparent scholarly communication. <italic>Research Integrity and Peer Review</italic>, <italic>5</italic>(10), 1&#x2013;8. <ext-link ext-link-type="doi" xlink:href="10.1186/s41073-020-00095-y">https://doi.org/10.1186/s41073-020-00095-y</ext-link></mixed-citation></ref>
<ref id="r16"><mixed-citation>KNAW. (2018). <italic>Replication studies &#x2013; Improving reproducibility in the empirical sciences</italic>. Amsterdam: KNAW.</mixed-citation></ref>
<ref id="r17"><mixed-citation>LaPolla, F. W. Z., Bakker, C. J., Exner, N., Montnech, T., Surkis, A., &#x0026; Ye, H. (2022). Rigor and reproducibility instruction in academic medical libraries. <italic>Journal of the Medical Library Association</italic>, <italic>110</italic>(3), Article 3. <ext-link ext-link-type="doi" xlink:href="10.5195/jmla.2022.1443">https://doi.org/10.5195/jmla.2022.1443</ext-link></mixed-citation></ref>
<ref id="r18"><mixed-citation>Lasser, J. (2020). Creating an executable paper is a journey through Open Science. <italic>Communications Physics, 143</italic>(3), 1&#x2013;5. <ext-link ext-link-type="doi" xlink:href="10.1038/s42005-020-00403-4">https://doi.org/10.1038/s42005-020-00403-4</ext-link></mixed-citation></ref>
<ref id="r19"><mixed-citation>Library Publishing Coalition Ethical Framework Task Force. (2023). <italic>An Ethical Framework for Library Publishing, Version 2.0</italic>. Library Publishing Coalition <ext-link ext-link-type="doi" xlink:href="10.5703/1288284317619">https://doi.org/10.5703/1288284317619</ext-link></mixed-citation></ref>
<ref id="r20"><mixed-citation>Loffreda, L., &#x0026; Chiarelli, A. (2021). <italic>Publishing Reproducible Research Outputs&#x2014;Literature sources and Thematic coding</italic> (Version 1) [Data set]. Zenodo. <ext-link ext-link-type="doi" xlink:href="10.5281/ZENODO.4748748">https://doi.org/10.5281/ZENODO.4748748</ext-link></mixed-citation></ref>
<ref id="r21"><mixed-citation>Lyon, L. (2016). Transparency: The emerging third dimension of Open Science and Open Data. <italic>LIBER Quarterly: The Journal of the Association of European Research Libraries</italic>, <italic>25</italic>(4), 153&#x2013;171. <ext-link ext-link-type="doi" xlink:href="10.18352/lq.10113">https://doi.org/10.18352/lq.10113</ext-link></mixed-citation></ref>
<ref id="r22"><mixed-citation>Martinez-Lavanchy, P. M., van Sch&#x00F6;ll, P., &#x0026; Zormpa, E. (2022). <italic>TU Delft Research Data Management 101 course</italic>. Zenodo. <ext-link ext-link-type="doi" xlink:href="10.5281/ZENODO.6325919">https://doi.org/10.5281/ZENODO.6325919</ext-link></mixed-citation></ref>
<ref id="r23"><mixed-citation>Martinez-Ortiz, C., Martinez Lavanchy, P., Sesink, L., Olivier, B. G., Meakin, J., de Jong, M., &#x0026; Cruz, M. (2023). <italic>Practical guide to Software Management Plans</italic> (Version 1.1). Netherlands eScience Center and NWO. <ext-link ext-link-type="doi" xlink:href="10.5281/ZENODO.7589725">https://doi.org/10.5281/ZENODO.7589725</ext-link></mixed-citation></ref>
<ref id="r24"><mixed-citation>Munaf&#x00F2;, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., &#x0026; Ioannidis, J. P. A. (2017). A manifesto for reproducible science. <italic>Nature Human Behaviour</italic>, <italic>1</italic>(1), 1&#x2013;9. <ext-link ext-link-type="doi" xlink:href="10.1038/s41562-016-0021">https://doi.org/10.1038/s41562-016-0021</ext-link></mixed-citation></ref>
<ref id="r25"><mixed-citation>National Academies of Sciences, Engineering, and Medicine (NASEM). (2018). <italic>Open science by design: Realizing a vision for 21st century research</italic>. <ext-link ext-link-type="doi" xlink:href="10.17226/25116">https://doi.org/10.17226/25116</ext-link></mixed-citation></ref>
<ref id="r26"><mixed-citation>National Academies of Sciences, Engineering, and Medicine (NASEM). (2019). <italic>Reproducibility and replicability in science</italic>. <ext-link ext-link-type="doi" xlink:href="10.17226/25303">https://doi.org/10.17226/25303</ext-link></mixed-citation></ref>
<ref id="r27"><mixed-citation>National Institutes of Health. (2014). <italic>Principles and guidelines for reporting preclinical research</italic>. <ext-link ext-link-type="uri" xlink:href="https://grants.nih.gov/policy/reproducibility/principles-guidelines-reporting-preclinical-research.htm">https://grants.nih.gov/policy/reproducibility/principles-guidelines-reporting-preclinical-research.htm</ext-link></mixed-citation></ref>
<ref id="r28"><mixed-citation>Neylon, C., Bels&#x00F8;, R., Bijsterbosch, M., Cordewener, B., Foncel, J., Friesike, S., Fyfe, A., Jacobs, N., Katerbow, M., Laakso, M., &#x0026; Sesink, L. (2019). <italic>Open Scholarship and the need for collective action</italic>. Zenodo. <ext-link ext-link-type="doi" xlink:href="10.5281/zenodo.3454688">https://doi.org/10.5281/zenodo.3454688</ext-link></mixed-citation></ref>
<ref id="r29"><mixed-citation>Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck S., Chambers, C., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D., Hesse, A., Humphreys, M., &#x2026; Boycan, E. (2023, September 9). <italic>Transparency and Openness Promotion (TOP) Guidelines</italic>. <ext-link ext-link-type="uri" xlink:href="https://osf.io/9f6gx">https://osf.io/9f6gx</ext-link>.</mixed-citation></ref>
<ref id="r30"><mixed-citation>Oliver, J. C., Kollen, C., Hickson, B., &#x0026; Rios, F. (2019). Data science support at the academic library. <italic>Journal of Library Administration</italic>, <italic>59</italic>(3), 241&#x2013;257. <ext-link ext-link-type="doi" xlink:href="10.1080/01930826.2019.1583015">https://doi.org/10.1080/01930826.2019.1583015</ext-link></mixed-citation></ref>
<ref id="r31"><mixed-citation>Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. <italic>Science, 349</italic>(6251), 943. <ext-link ext-link-type="doi" xlink:href="10.1126/science.aac4716">https://doi.org/10.1126/science.aac4716</ext-link></mixed-citation></ref>
<ref id="r32"><mixed-citation>Packer, E. (2020, August 24). <italic>ELife launches Executable Research Articles for publishing computationally reproducible results</italic>. eLife. <ext-link ext-link-type="uri" xlink:href="https://elifesciences.org/for-the-press/eb096af1/elife-launches-executable-research-articles-for-publishing-computationally-reproducible-results">https://elifesciences.org/for-the-press/eb096af1/elife-launches-executable-research-articles-for-publishing-computationally-reproducible-results</ext-link></mixed-citation></ref>
<ref id="r33"><mixed-citation>Parsons, S., Azevedo, F., Elsherif, M. M., Guay, S., Shahim, O. N., Govaart, G. H., Norris, E., O&#x2019;Mahony, A., Parker, A. J., Todorovic, A., Pennington, C. R., Garcia-Pelegrin, E., Lazi&#x0107;, A., Robertson, O., Middleton, S. L., Valentini, B., McCuaig, J., Baker, B. J., Collins, E., &#x2026; Aczel, B. (2022). A community-sourced glossary of open scholarship terms. <italic>Nature Human Behaviour, 6</italic>(3), 312&#x2013;318. <ext-link ext-link-type="doi" xlink:href="10.1038/s41562-021-01269-4">https://doi.org/10.1038/s41562-021-01269-4</ext-link></mixed-citation></ref>
<ref id="r34"><mixed-citation>Quan, J. (2021). Toward reproducibility: Academic libraries and open science. In J. Herndon, (Ed.), <italic>Data science in the library: Tools and strategies for supporting data-driven research and instruction</italic> (pp. 49&#x2013;66). Facet. <ext-link ext-link-type="doi" xlink:href="10.29085/9781783304615.004">https://doi.org/10.29085/9781783304615.004</ext-link></mixed-citation></ref>
<ref id="r35"><mixed-citation>Rahal, R.-M., Hamann, H., Brohmer, H., &#x0026; Pethig, F. (2022). Sharing the recipe: Reproducibility and replicability in research across disciplines. <italic>Research Ideas and Outcomes</italic>, <italic>8</italic>, Article e89980. <ext-link ext-link-type="doi" xlink:href="10.3897/rio.8.e89980">https://doi.org/10.3897/rio.8.e89980</ext-link></mixed-citation></ref>
<ref id="r36"><mixed-citation>Rethlefsen, M. L., Lackey, M. J., &#x0026; Zhao, S. (2018). Building capacity to encourage research reproducibility and #MakeResearchTrue. <italic>Journal of the Medical Library Association</italic>, <italic>106</italic>(1), 113&#x2013;119. <ext-link ext-link-type="doi" xlink:href="10.5195/jmla.2018.273">https://doi.org/10.5195/jmla.2018.273</ext-link></mixed-citation></ref>
<ref id="r37"><mixed-citation>Ross-Hellauer, T., Klebel, T., Bannach-Brown, A., Horbach, S. P. J. M., Jabeen, H., Manola, N., Metodiev, T., Papageorgiou, H., Reczko, M., Sansone, S.-A., Schneider, J., Tijdink, J., &#x0026; Vergoulis, T. (2022). TIER2: enhancing trust, integrity and efficiency in research through next-level reproducibility. <italic>Research Ideas and Outcomes</italic>, <italic>8</italic>, Article e98457. <ext-link ext-link-type="doi" xlink:href="10.3897/rio.8.e98457">https://doi.org/10.3897/rio.8.e98457</ext-link></mixed-citation></ref>
<ref id="r38"><mixed-citation>Sayre, F., &#x0026; Riegelman, A. (2019). Replicable services for reproducible research: A model for academic libraries. <italic>College &#x0026; Research Libraries, 80</italic>(2), 260&#x2013;272. <ext-link ext-link-type="doi" xlink:href="10.5860/crl.80.2.260">https://doi.org/10.5860/crl.80.2.260</ext-link></mixed-citation></ref>
<ref id="r39"><mixed-citation>Stanford Encyclopedia of Philosophy. (2018). <italic>Reproducibility of scientific results</italic>. <ext-link ext-link-type="uri" xlink:href="https://plato.stanford.edu/entries/scientific-reproducibility/">https://plato.stanford.edu/entries/scientific-reproducibility/</ext-link></mixed-citation></ref>
<ref id="r40"><mixed-citation>Steeves, V. (2017). Reproducibility librarianship. <italic>Collaborative Librarianship</italic>, <italic>9</italic>(2), 80&#x2013;89. <ext-link ext-link-type="uri" xlink:href="https://digitalcommons.du.edu/collaborativelibrarianship/vol9/iss2/4">https://digitalcommons.du.edu/collaborativelibrarianship/vol9/iss2/4</ext-link></mixed-citation></ref>
<ref id="r41"><mixed-citation>Sumner, J., Haynes, L., Nathan, S., Hudson-Vitale, C., &#x0026; McIntosh, L. D. (2020). <italic>Reproducibility and reporting practices in COVID-19 preprint manuscripts</italic>. medRxiv. <ext-link ext-link-type="doi" xlink:href="10.1101/2020.03.24.20042796">https://doi.org/10.1101/2020.03.24.20042796</ext-link></mixed-citation></ref>
<ref id="r42"><mixed-citation>The Turing Way Community. (2022). <italic>The Turing Way: A handbook for reproducible, ethical and collaborative research</italic> (1.0.2). Zenodo. <ext-link ext-link-type="doi" xlink:href="10.5281/zenodo.7625728">https://doi.org/10.5281/zenodo.7625728</ext-link></mixed-citation></ref>
<ref id="r43"><mixed-citation>Tsang, E., &#x0026; Maciocci, G. (2020, August 24). <italic>Welcome to a new ERA of reproducible publishing</italic>. eLife. <ext-link ext-link-type="uri" xlink:href="https://elifesciences.org/labs/dc5acbde/welcome-to-a-new-era-of-reproducible-publishing">https://elifesciences.org/labs/dc5acbde/welcome-to-a-new-era-of-reproducible-publishing</ext-link></mixed-citation></ref>
<ref id="r44"><mixed-citation>Wellcome Trust. (2017, July 10). <italic>Data, software and materials management and sharing policy</italic>. <ext-link ext-link-type="uri" xlink:href="https://wellcome.org/grant-funding/guidance/data-software-materials-management-and-sharing-policy">https://wellcome.org/grant-funding/guidance/data-software-materials-management-and-sharing-policy</ext-link></mixed-citation></ref>
<ref id="r45"><mixed-citation>Wilkinson, M. D., Dumontier, M., Aalbersberg, Ij. J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., da Silva Santos, L. B., Bourne, P. E., Bouwman, J., Brookes, A. J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C. T., Finkers, R., &#x2026; Mons, B. (2016). The FAIR Guiding Principles for scientific data management and stewardship. <italic>Scientific Data</italic>, <italic>3</italic>(1), Article 160018. <ext-link ext-link-type="doi" xlink:href="10.1038/sdata.2016.18">https://doi.org/10.1038/sdata.2016.18</ext-link></mixed-citation></ref>
</ref-list>
</back>
</article>
