<?xml version="1.0" encoding="us-ascii"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">
<article article-type="research-article" xml:lang="EN" xmlns:xlink="http://www.w3.org/1999/xlink">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">LIBER</journal-id>
<journal-title-group>
<journal-title>LIBER QUARTERLY</journal-title>
</journal-title-group>
<issn pub-type="epub">2213-056X</issn>
<publisher>
<publisher-name>Uopen Journals</publisher-name>
<publisher-loc>Utrecht, The Netherlands</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">lq.10113</article-id>
<article-id pub-id-type="doi">10.18352/lq.10113</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Transparency: The Emerging Third Dimension of Open Science and Open Data</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Lyon</surname>
<given-names>Liz</given-names>
</name>
<email>elyon@pitt.edu</email>
<xref ref-type="aff" rid="aff1"/>
</contrib>
<aff id="aff1">Visiting Professor, School of Information Sciences, University of Pittsburgh, USA</aff>
</contrib-group>
<pub-date pub-type="epub">
<month>3</month>
<year>2016</year>
</pub-date>
<volume>25</volume>
<issue>4</issue>
<fpage>153</fpage>
<lpage>171</lpage>
<permissions>
<copyright-statement>Copyright 2016, The copyright of this article remains with the author</copyright-statement>
<copyright-year>2016</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. See <uri xlink:href="http://creativecommons.org/licenses/by/4.0/">http://creativecommons.org/licenses/by/4.0/</uri>.</license-p>
</license>
</permissions>
<self-uri xlink:href="https://www.liberquarterly.eu/article/10.18352/lq.10113"/>
<abstract>
<p>This paper presents an exploration of the concept of research transparency. The policy context is described and situated within the broader arena of open science. This is followed by commentary on transparency within the research process, which includes a brief overview of the related concept of reproducibility and the associated elements of research integrity, fraud and retractions. A two-dimensional model or continuum of open science is considered and the paper builds on this foundation by presenting a three-dimensional model, which includes the additional axis of &#x2018;transparency&#x2019;. The concept is further unpacked and preliminary definitions of key terms are introduced: transparency, transparency action, transparency agent and transparency tool. An important linkage is made to the research lifecycle as a setting for potential transparency interventions by libraries. Four areas are highlighted as foci for enhanced engagement with transparency goals: Leadership and Policy, Advocacy and Training, Research Infrastructures and Workforce Development.</p>
</abstract>
<kwd-group>
<kwd>research transparency</kwd>
<kwd>open science</kwd>
<kwd>research data services</kwd>
<kwd>LIS education</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<title>1. Introduction</title>
<p>This paper presents an exploration of the concept of research transparency as a precursor to a new program of qualitative research at the School of Information Sciences (iSchool), University of Pittsburgh, which is investigating the perceptions of research transparency amongst librarians in universities and research institutions. The motivations for exploring this concept (and the associated concept of reproducibility) are multi-faceted. Firstly, at a national level, transparency has been highlighted by government leaders as a characteristic of open government (<xref ref-type="bibr" rid="r19">Holdren, Orszag, &#x0026; Prouty, 2009</xref>). Secondly, federal funding agencies with significant research portfolios, such as the US National Institutes of Health (NIH), have articulated their policy and plans for more rigorous research (<xref ref-type="bibr" rid="r22">Lauer, 2015</xref>; <xref ref-type="bibr" rid="r29">NIH, 2015a</xref>), supported by revised grant application guidance (<xref ref-type="bibr" rid="r30">NIH, 2015b</xref>). Thirdly, leaders of professional societies, such as the National Academy of Sciences, have voiced their concerns about the current reporting of science (<xref ref-type="bibr" rid="r5">Cicerone, 2015</xref>), and other organizations, such as the Federation of American Societies for Experimental Biology (FASEB), have published recommendations for enhancing research reproducibility, which include transparency parameters (<xref ref-type="bibr" rid="r16">FASEB, 2016</xref>). Finally, some journals, such as PLOS Biology, are taking a proactive approach to encourage reproducibility efforts (<xref ref-type="bibr" rid="r10">Denker, 2016</xref>).</p>
<p>In laying the foundation, we first examine &#x2018;transparency&#x2019; within the policy context, noting the broader policy arena of open science, which has been articulated by research funding agencies, national governments and other interested parties. This is followed by commentary on transparency within the research process, which includes a brief overview of the related concept of reproducibility and introduces the elements of research integrity, fraud and retractions. In the later sections, an existing two-dimensional model or continuum is revisited, and the paper builds on this framework by presenting a new three-dimensional model, which includes the additional axis of &#x2018;transparency&#x2019;. In order to acquire a better understanding of its relevance to libraries, the concept of transparency is further unpacked in terms of definitions and vocabulary, and selected key terms are introduced. We review opportunities for potential transparency interventions and situate these within the research lifecycle. The final section considers the practical implications for library and information services. The emergence of a range of new research data services is attracting much professional debate as a key area of development for academic and research libraries. In this context, four areas are highlighted as foci for enhanced engagement with transparency goals: Leadership and Policy, Advocacy and Training, Research Infrastructures and Workforce Development.</p>
</sec>
<sec id="s2">
<title>2. Two Perspectives on Transparency</title>
<sec id="s2a">
<title>2.1 Transparency in the Policy Context</title>
<p>Governments and policy-setting bodies in North America, Europe and Australasia, as well as in other global nations, have set a clear agenda for open science and open data, which is supported by research funding agency requirements for Data Management or Data Sharing Plans as a component of submitted research proposals (e.g. the NSF and NIH, the UK Research Councils and the European Commission). Research transparency is an identified concept, principle or value articulated within many of these policy statements. The thirty countries of the OECD identified transparency as one of the principles in the OECD Guidelines for access to research data from public funding (<xref ref-type="bibr" rid="r34">OECD, 2007</xref>). These guidelines highlight four factors to consider in ensuring transparency: &#x201C;<italic>documentation on available datasets and conditions of use should be easy to find</italic>&#x201D;, &#x201C;<italic>research agencies should actively disseminate information on research data policies</italic>&#x201D;, &#x201C;<italic>members of the various research communities should assist in establishing agreements on standards for cataloguing data</italic>&#x201D; and &#x201C;<italic>Information on data management and access conditions should be communicated among data archives and data producing institutions</italic>&#x201D;. In the United States, the Obama Administration released a memorandum on Transparency and Open Government (<xref ref-type="bibr" rid="r19">Holdren et al., 2009</xref>) which set out three specific actions for departments and government agencies: Transparency, Participation and Collaboration. Transparency as a value has been discussed in ideological terms by <xref ref-type="bibr" rid="r13">Etzioni (2010)</xref>, who describes the strong variant relating to regulatory contexts and disclosure.</p>
<p>
The <xref ref-type="bibr" rid="r39">Royal Society Report (2012)</xref> &#x201C;Science as an Open Enterprise&#x201D; makes reference to &#x201C;<italic>transparent policies for custodianship, data quality and access</italic>&#x201D; in outlining a set of principles of stewardship which should be shared by custodians of scientific work. In the UK, the Research Councils have published a set of Common Principles on Data Policy (originally published in 2011 and revised: <xref ref-type="bibr" rid="r38">RCUK, 2015</xref>) which state: &#x201C;<italic>Making research data available to users is a core part of the Research Councils&#x2019; remit and is undertaken in a variety of ways. We are committed to transparency and to a coherent approach across the research base</italic>&#x201D;. The G8 countries&#x2019; Open Data Charter (<xref ref-type="bibr" rid="r18">Gov.UK, 2013</xref>) states that &#x201C;<italic>Open data can increase transparency about what government and business are doing</italic>&#x201D;, and its Principle 4 states that &#x201C;<italic>We will be transparent about our own data collection, standards, and publishing processes, by documenting all of these related processes online</italic>&#x201D;. Furthermore, transparency was an integral element of the European Commission Horizon 2020 program calls, which highlighted transparency elements within innovation pilots for open government.</p>
<p>
In 2015, the OECD published a substantive report on open science, which identifies &#x201C;<italic>increasing transparency and quality in the research validation process</italic>&#x201D; as a rationale for open science and open data (<xref ref-type="bibr" rid="r35">OECD, 2015</xref>); also in 2015, four major global science organizations (the International Council for Science, the InterAcademy Partnership, the World Academy of Sciences and the International Social Science Council) published an international accord, which includes the assertion that &#x201C;<italic>Openness and transparency have formed the bedrock on which the progress of science in the modern era has been based</italic>&#x201D; (<xref ref-type="bibr" rid="r20">ICSU, 2015</xref>).</p>
</sec>
<sec id="s2b">
<title>2.2 Transparency in the Research Process</title>
<p>Moving from the policy perspective to look more widely at the literature describing the practice of research, there has been a gradual increase in the attention given to transparency concepts and reproducible science protocols, processes and products. At the disciplinary level, there is considerable diversity in open practices, with certain domains having well-established norms for data release (e.g. astronomy and genomics), whilst in other disciplines, notably across the humanities and social sciences, open data practices are less common. <xref ref-type="bibr" rid="r37">Peng (2011)</xref>, for example, speaks to this disciplinary diversity: addressing reproducibility in computational science, Peng observes disciplinary differences in the &#x2018;culture of replication&#x2019; and suggests that replication may be hindered by the size of datasets and the amount of computing power, time, and money necessary to reproduce the study. <xref ref-type="bibr" rid="r17">Gezelter (n.d.)</xref> expands on the importance of verifiability in good science and in particular notes that we need &#x201C;<italic>verifiability in practice as well as verifiability in principle</italic>&#x201D;. <xref ref-type="bibr" rid="r6">Collins and Tabak (2014)</xref> call for enhanced reproducibility in research funded by the US National Institutes of Health, and describe positive steps associated with publishing practice and scholarly communications.</p>
<p>Examples of uncertain outcomes and flawed research due to a lack of transparency are highlighted in two articles in <xref ref-type="bibr" rid="r45"><italic>The Economist</italic> (2010</xref>, <xref ref-type="bibr" rid="r46">2013</xref>), a <italic>New York Times</italic> (<xref ref-type="bibr" rid="r3">Carey, 2011</xref>) expos&#x00E9; on fraud in psychological research, and a discussion of reproducibility issues in the biomedical/clinical trials domain (<xref ref-type="bibr" rid="r21">Ince, 2011</xref>). The related controversy around peer review failures and the current &#x201C;retraction epidemic&#x201D; has been described by <xref ref-type="bibr" rid="r15">Fang, Steen, and Casadevall (2013)</xref>, where many retractions (43&#x0025; of the sample) were due to fabrication or falsification. Many of these retractions have led to significant adverse reputational impacts on both institutions and individuals.</p>
<p>In parallel with these high-profile cases of research malpractice, a number of practical approaches towards achieving greater research transparency have emerged. These include institutional data policies that recommend data deposit and data sharing (e.g. <xref ref-type="bibr" rid="r47">University of Bath, 2014</xref>), open data repositories such as Dryad, open software tools for sharing workflows such as Taverna, and electronic laboratory notebooks like RSpace. The Reproducibility Initiative and the associated Reproducibility Project in Cancer Biology, led by the Science Exchange<xref ref-type="fn" rid="fn1">1</xref> and its independent validation service, have taken transparency and trust goals a stage further, by seeking to reproduce the results in fifty high-impact cancer articles published in 2010&#x2013;2012 (<xref ref-type="bibr" rid="r12">Errington et al., 2014</xref>). A similar initiative was taken in psychology, where a collaborative group of researchers have attempted to replicate studies published in three psychological journals in 2008 (<xref ref-type="bibr" rid="r32">Nosek, 2012</xref>; <xref ref-type="bibr" rid="r36">Open Science Collaboration, 2015</xref>). This group has introduced the publication format of Registered Replication Reports (<xref ref-type="bibr" rid="r33">Nosek &#x0026; Lakens, 2014</xref>), where the research methodology undergoes peer-review before the experimental procedures are executed, data collected and results published.</p>
<p>There is also ongoing work to explore different approaches to open data peer review processes (<xref ref-type="bibr" rid="r28">Mayernik, Callaghan, Leigh, Tedds, &#x0026; Worley, 2014</xref>), and efforts to incentivize peer review, exemplified by the GigaScience partnership with Publons.<xref ref-type="fn" rid="fn2">2</xref> Scholarly publishers and professional societies have an influential role in the research landscape, with organizations articulating their open data policies and expectations of researchers (<xref ref-type="bibr" rid="r1">American Meteorological Society, 2013</xref>). However, the perceived lack of transparency in the scholarly publishing process has led to a proposal for a new Transparency Index, which could contain details of editorial boards, data requirements and procedures for dealing with retractions (<xref ref-type="bibr" rid="r27">Marcus &#x0026; Oransky, 2012</xref>). Furthermore, whilst the different stakeholders in the scholarly communications arena are exploring ways to increase the rigor of the research process, libraries are considering the conceptual framing and scope of open science, and delivering innovative research data services.</p>
</sec>
</sec>
<sec id="s3">
<title>3. Open Science Models</title>
<sec id="s3a">
<title>3.1 Open Science in Two Dimensions</title>
<p>Open research has been characterized by a commitment and adherence to accessibility, sharing, transparency, and inclusivity (<xref ref-type="bibr" rid="r2">Borgman, 2015</xref>). Previously, a Continuum of Openness was described by <xref ref-type="bibr" rid="r23">Lyon (2009)</xref>, which had two orthogonal axes: &#x2018;Access&#x2019; and &#x2018;Participation&#x2019; (or inclusivity). The &#x2018;Access&#x2019; dimension addressed the ability to (freely) locate and retrieve articles, ranging from &#x2018;closed&#x2019; sources, such as a peer-reviewed journal positioned behind a subscription paywall, to an open access institutional repository such as D-Scholarship. The &#x2018;Participation&#x2019; dimension encompassed the degree of collaboration in research, ranging from a single lone scholar, through collaborative professional research teams (team science), to citizen science, where members of the public are partners in the design, implementation and publication of research, such as the Zooniverse projects. Thus a &#x2018;continuum&#x2019; of openness can be described, with organisations, projects and infrastructure platforms positioned on the axes according to their degree of openness. This two-dimensional model was framed as a Continuum of Openness and is shown in <xref ref-type="fig" rid="fg001">Figure 1</xref>.</p>
<fig id="fg001">
<label>Fig. 1:</label>
<caption><p>2D-model of Open Science (Based on the Continuum of Openness in <xref ref-type="bibr" rid="r23">Lyon, 2009</xref>).</p></caption>
<graphic xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="figures/Lyon_fig1.jpg"/>
</fig>
<p>The first of these two dimensions (&#x2018;Access&#x2019;) was also explored in more detail by <xref ref-type="bibr" rid="r7">Corrall and Pinfield (2014)</xref>, who constructed a typology of &#x201C;open&#x201D; (open content, open development, open infrastructure) and examined convergence and coherence amongst initiatives in higher education and research. The second dimension (&#x2018;Participation&#x2019;) was examined in the context of libraries by <xref ref-type="bibr" rid="r24">Lyon and Beaton (2015)</xref>, who reviewed citizen science initiatives, education and skills development, and opportunities for academic libraries, public libraries and Library Schools/iSchools, recognizing a need for greater awareness and education among librarians about the opportunities in this field.</p>
</sec>
<sec id="s3b">
<title>3.2 Introducing a 3D-Model of Open Science</title>
<p>The concept of transparency and the associated term &#x2018;reproducibility&#x2019; have become increasingly important in the current interdisciplinary research environment, which is characterized by greater data volumes, burgeoning numbers of research publications and, particularly in these times of economic constraint, a critical requirement for accountability for the expenditure of public research funds and the impact derived from them. The Transparency Principle has been presented as &#x201C;<italic>Information on research data and data-producing organization, documentation on the data and conditions attached to the use of the data should be internationally available in a transparent way, ideally through the Internet</italic>&#x201D; (<xref ref-type="bibr" rid="r34">OECD, 2007</xref>). Further interpretations are &#x201C;<italic>full transparency in reporting experimental details so that others may reproduce and extend the findings</italic>&#x201D; (<xref ref-type="bibr" rid="r31">NIH, 2016</xref>) and &#x201C;<italic>the reporting of experimental materials and methods in a manner that provides enough information for others to independently assess and/or reproduce experimental findings</italic>&#x201D; (<xref ref-type="bibr" rid="r16">FASEB, 2016</xref>).</p>
<p>Whilst definitions of reproducibility and repeatability were published over twenty years ago by <xref ref-type="bibr" rid="r44">Taylor and Kuyatt (1994)</xref> in a NIST Technical Note, reproducibility concepts have been examined again from both computational and legal perspectives by <xref ref-type="bibr" rid="r41">Stodden (2009)</xref>. Useful definitions of open science and reproducible research have been proposed: &#x201C;<italic>Open or reproducible research is auditable research made openly available</italic>&#x201D;, and furthermore, &#x201C;<italic>Auditable research is where sufficient records (including data and software) have been archived so that the research can be defended later if necessary or differences between independent confirmations resolved. The archive might be private, as with traditional laboratory notebooks.</italic>&#x201D; (<xref ref-type="bibr" rid="r42">Stodden et al., 2013</xref>).</p>
<p>The use of the terms &#x2018;archiving&#x2019; and &#x2018;records&#x2019; in this definition serves to emphasise the potentially significant role of libraries and information services in supporting open research in the long term. Three distinct categories of reproducibility were described by <xref ref-type="bibr" rid="r43">Stodden, Leisch, and Peng (2014)</xref>: computational (access to code, data and other implementation details), empirical (non-computational empirical scientific experiments) and statistical (analyses and assessments). <xref ref-type="bibr" rid="r11">Easterbrook (2014)</xref> used a Venn diagram to illustrate the relationship between repeatability and reproducibility based on open source code, which contributes towards the goal of verifying the computational research which has been undertaken and subsequently published. In this environment of accountability, the original 2D-model (or continuum) of open science has been extended to include a third dimension, &#x201C;Transparency&#x201D;, which is shown in <xref ref-type="fig" rid="fg002">Figure 2</xref>. This third dimension spans efforts ranging from Retraction Watch (a blog which tracks retractions), to the Reproducibility Project in Cancer Biology (an initiative to replicate the results of high-profile cancer studies) and the eLife journal, which has published the outcomes of these reproducibility studies.</p>
<fig id="fg002">
<label>Fig. 2:</label>
<caption><p>3D-Model of Open Science.</p></caption>
<graphic xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="figures/Lyon_fig2.jpg"/>
</fig>
</sec>
</sec>
<sec id="s4">
<title>4. Further Unpacking the Transparency Concept</title>
<p>What exactly is meant by transparency in the context of research? As a start, it may be helpful to consider <italic><bold>what transparency is not</bold></italic>; various terms and phrases capture different aspects of this assertion and some are listed in <xref ref-type="table" rid="tb001">Table 1</xref>. Note that certain transparency terms also align towards Participation/Inclusivity or towards Access, indicating the high degree of inter-dependency and connectedness between the three axes in the model. However, the two sub-categories of &#x2018;Clarity&#x2019; and &#x2018;Integrity&#x2019; are unique to the Transparency concept and have particular resonance within research practice, with implications for data, workflows and scholarly publications.</p>
<table-wrap id="tb001">
<label>Table 1:</label>
<caption><p>What Transparency is Not in Twenty Terms.</p></caption>
<table>
<thead>
<tr>
<th valign="top" align="left">Clarity</th>
<th valign="top" align="left">Integrity</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Confusing</td>
<td valign="top" align="left">Not verified</td>
</tr>
<tr>
<td valign="top" align="left">Gray/grey</td>
<td valign="top" align="left">Not validated</td>
</tr>
<tr>
<td valign="top" align="left">Vague</td>
<td valign="top" align="left">Not auditable</td>
</tr>
<tr>
<td valign="top" align="left">Unclear</td>
<td valign="top" align="left">Not supported</td>
</tr>
<tr>
<td valign="top" align="left">Opaque</td>
<td valign="top" align="left">Not described</td>
</tr>
<tr>
<td valign="top" align="left">Ambiguous</td>
<td valign="top" align="left">Not documented</td>
</tr>
<tr>
<td valign="top" align="left">Obscured</td>
<td valign="top" align="left">Not recorded</td>
</tr>
<tr>
<td valign="top" align="left">Implicit</td>
<td valign="top" align="left">Not versioned</td>
</tr>
<tr>
<td valign="top" align="left">Hidden</td>
<td valign="top" align="left">Not tracked</td>
</tr>
<tr>
<td valign="top" align="left">Secret</td>
<td valign="top" align="left">No provenance</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Building on earlier definitions and terminology associated with Reproducibility, and drawing on current computational and organizational vocabulary, some foundational terms and articulations for Transparency concepts in the context of Open Science are proposed and summarized in <xref ref-type="table" rid="tb002">Table 2</xref>. These may be developed, augmented and extended to build a more comprehensive vocabulary of transparency-related terms.</p>
<table-wrap id="tb002">
<label>Table 2:</label>
<caption><p>Foundational Terms for Transparency in Open Science.</p></caption>
<table>
<thead>
<tr>
<th valign="top" align="left">Term</th>
<th valign="top" align="left">Exposition</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Transparency</td>
<td valign="top" align="left">The outcome from a suite of behaviours which characterize Reproducible Research</td>
</tr>
<tr>
<td valign="top" align="left">Transparency</td>
<td valign="top" align="left">Facilitates and enhances Research Quality, Research Integrity and Trust</td>
</tr>
<tr>
<td valign="top" align="left">Transparency Action</td>
<td valign="top" align="left">Describes a specific intervention which is a component of the processes, protocols and practices within the Research Lifecycle</td>
</tr>
<tr>
<td valign="top" align="left">Transparency Agent</td>
<td valign="top" align="left">Exemplified by the Data Science roles e.g. Data Librarian. These are key components of the Data Fabric (RDA) and supporting Infrastructure; they promote and demonstrate specific behaviours and practices which lead to culture change towards Open Science</td>
</tr>
<tr>
<td valign="top" align="left">Transparency Tool</td>
<td valign="top" align="left">The software and model frameworks which support Open Science practice</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>The fundamental concept of Transparency can be considered as an outcome of a combination of different behaviours and practices associated with reproducibility, which are implemented by the various actors and stakeholders in the research process. Transparency is generally viewed positively, in particular within the settings of institutional or organizational audits, external scrutiny for research malpractice and demonstrating accountability to funding bodies. Transparent research practices and processes also serve to demonstrate more rigorous methodologies or experimental protocols and to strengthen public perceptions of research quality, integrity and trust in the results, claims, conclusions and assertions derived from research activities. The research lifecycle forms a rich foundation or substrate for grounding thinking about transparency. <xref ref-type="fig" rid="fg003">Figure 3</xref> shows a research lifecycle developed by the University Library System Research Data Management Working Group at the University of Pittsburgh, led by Dr Nora Mattern, with some additional &#x2018;Transparency Tracking&#x2019; points highlighted. (Note that in <xref ref-type="fig" rid="fg003">Figure 3</xref>, ELN&#x003D;Electronic Laboratory Notebook; DMP&#x003D;Data Management Plan.)</p>
<fig id="fg003">
<label>Fig. 3:</label>
<caption><p>Tracking Transparency within the Research Lifecycle.</p></caption>
<graphic xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="figures/Lyon_fig3.jpg"/>
</fig>
<p>Understanding the specific tasks, actions and transactions associated with the component stages, workflows, objects and infrastructure within the research lifecycle will help to illustrate the complexity and proliferation of intervention points where greater transparency can be achieved. Research workflows contain tasks, sub-tasks and actions which are executed either by the researcher or by another physical (or software) &#x2018;agent&#x2019; in the process. These proactive interventions can be characterized by Transparency Actions (verbs); examples include &#x2018;describe&#x2019;, &#x2018;identify&#x2019; and &#x2018;share&#x2019;. Transparency Agents can be characterized as defined roles, named individuals or organisations that execute a specific action or intervention. They are exemplified by the six new Data Science roles described by <xref ref-type="bibr" rid="r25">Lyon and Brenner (2015)</xref>, e.g. Data Librarian, Data Archivist, Data Steward. Transparency Agents also advocate, promote and demonstrate particular behaviours and good practices which, over time, will lead to culture change towards a more Open Science environment. The Data Science roles or positions are key human infrastructure components of the Data Fabric articulated by the Research Data Alliance<xref ref-type="fn" rid="fn3">3</xref> (RDA) and complement the supporting technical infrastructure, such as institutional repository platforms and software tools like the Open Science Framework<xref ref-type="fn" rid="fn4">4</xref>. These types of research lifecycle component can be designated as Transparency Tools, and their use and application within research workflows is desirable.</p>
<p>A presentation of transparency terms as &#x201C;transparency standards&#x201D; has been developed by the Center for Open Science as part of the Open Science Framework and as a modular approach to its TOP Guidelines for Transparency and Openness Promotion in Journal Policies and Practices (<xref ref-type="bibr" rid="r4">Center for Open Science, 2015</xref>). These transparency standards are: 1. Citation; 2. Data transparency; 3. Analytic methods (code) transparency; 4. Research materials transparency; 5. Design and analysis transparency; 6. Preregistration of studies; 7. Preregistration of analysis plans; and 8. Replication. Templates are provided for three levels of each standard, to assist with the common expression of research practices for journal publications.</p>
</sec>
<sec id="s5">
<title>5. Implications for Library and Information Services</title>
<p>Academic library and information services (LIS) are currently tackling the diverse challenges of data curation, research data management and the provision of more extensive research support services. Recent international studies of research data services have identified a range of issues associated with these developments (<xref ref-type="bibr" rid="r8">Corrall, Kennan, &#x0026; Afzal, 2013</xref>; <xref ref-type="bibr" rid="r40">Si, Xing, Zhuang, Hua, &#x0026; Zhou, 2015</xref>). LIS have a key role in engaging with and contributing to different stages in the research lifecycle, and there are also critical implications for LIS education. In this paper, four broad areas of opportunity for LIS engagement and action on transparency in open science are highlighted. Questions such as &#x2018;How does this trend impact on library and institutional policy?&#x2019;, &#x2018;What new library services might be developed?&#x2019; and &#x2018;How should iSchools augment their educational offerings to encompass transparency concepts?&#x2019; will be explored.</p>
<sec id="s5a">
<title>5.1 Institutional Research Policy and Library Leadership Opportunities</title>
<p>Many academic institutions have a Research Policy or Research Code of Practice which states the principles, ethical foundations and expectations of researcher behaviour within that institution. These types of document may cover aspects of Open Science which correspond to the dimensions of the 3D Model. For example, a Research Policy may describe scholarly publication channels and include commentary on Open Access (OA) journals and institutional OA funds. The Policy may have some narrative regarding participation, inclusivity and academic inter-relationships with the public; it may explicitly support citizen science collaborations. Furthermore, the Research Policy may have clauses relating to Research Quality and Research Integrity in broad terms. Building on this point, LIS senior managers can highlight transparency and reproducibility issues and ensure that institutional policy developments in OA and research data management reflect the third dimension of open science, through requirements for transparent science processes, methodologies and peer review. In this way, LIS can lead on the inclusion of transparency principles as part of institutional policy.</p>
</sec>
<sec id="s5b">
<title>5.2 Advocacy and Training for Researchers</title>
<p>The increasing data volumes generated from high-throughput devices such as sequencers, computational analysis, large-scale simulations and expanding collections of observational and environmental sensor data have led to the emergence of a new field of Data Science. A range of new roles is associated with this field, encompassing data analysis, stewardship, software engineering, journalism, managing data archives and data librarianship (<xref ref-type="bibr" rid="r25">Lyon &#x0026; Brenner, 2015</xref>). Some academic libraries are extending their existing Research Support Services to include Research Data Management as a key component, with advocacy and advisory services (<xref ref-type="bibr" rid="r9">Cox &#x0026; Pinfield, 2014</xref>). Librarians can act as transparency advocates with faculty by advising on open (transparent) scholarship, reproducible methods and validation approaches. Raising awareness among new-entrant researchers and providing transparency and reproducibility information, tools and training are opportunities for libraries to further demonstrate their value and reach.</p>
</sec>
<sec id="s5c">
<title>5.3 Research Infrastructures</title>
<p>National and academic libraries are making significant progress towards sustainable digital stewardship. This goal involves supporting all of the stages of the research lifecycle: designing research protocols and project planning, developing data management or sharing plans; creating, collecting or locating data including metadata descriptions and the use of logs/records and electronic laboratory notebooks; processing data, including cleaning and integrity checks; analysing data, including statistical analysis and visualization; storing and publishing data including through deposit in a repository for long-term preservation; provenance and version control; data peer review and linking to journal articles; managing access to the data with licenses and rights documentation; re-using data via persistent identifiers; and data citation and data attribution metrics. A range of data tools are appearing that address particular aspects of the research data lifecycle, such as the DMPTool<xref ref-type="fn" rid="fn5">5</xref> for data management planning and ImpactStory<xref ref-type="fn" rid="fn6">6</xref> for collecting impact evidence and metrics. The Center for Open Science has launched a tool called the Open Science Framework (OSF), which is positioned as an open and collaborative project management tool. OSF aims to integrate with other data workflow and research infrastructure components (e.g. data repositories such as figshare<xref ref-type="fn" rid="fn7">7</xref>), and thereby increase transparency in the practice of science. Other examples of research infrastructures which support transparency are open source code hubs, open workflow tools, open repositories for data and textual publications, open lab notebooks and open discussion spaces and forums. Libraries can adopt infrastructure which supports open protocols and processes, create new library services around this open infrastructure and ensure that library curation workflows support transparency.</p>
</sec>
<sec id="s5d">
<title>5.4 Workforce Development</title>
<p>The need to re-engineer LIS education to deliver a curriculum suited to the new data science roles has been noted (<xref ref-type="bibr" rid="r25">Lyon &#x0026; Brenner, 2015</xref>). iSchool programs and courses require real-world relevance in order to produce work-ready graduates who can assume one of these new data science positions. An analysis of the educational requirements, skills, knowledge, and competencies from recent job descriptions for data librarians, data archivists and data stewards has identified the range of themes and topics in scope (<xref ref-type="bibr" rid="r26">Lyon, Mattern, Acker, &#x0026; Langmead, n.d.</xref>). Many iSchools now have data curation or digital stewardship courses, but are transparency and reproducibility concepts embedded in the curriculum? The MLIS program at the School of Information Sciences (iSchool), University of Pittsburgh, is adopting an innovative translational data science approach (&#x2018;<italic>the transition of data skills, software tools and research intelligence from the iSchool to the marketplace</italic>&#x2019;, defined in <xref ref-type="bibr" rid="r25">Lyon and Brenner, 2015</xref>), which mirrors the established terminology of translational medicine. New Masters courses in Research Data Management and Research Data Infrastructures address transparency, reproducibility and validation concepts. The aim is to produce transparency-savvy LIS graduates and to upskill current LIS staff for these new data science roles.</p>
</sec>
</sec>
<sec id="s6">
<title>6. Summary and Future Work</title>
<p>This short paper has begun to explore the emerging narrative associated with research transparency and has particularly focused on the implications and opportunities for libraries. The motivations for addressing transparency as a concept have been articulated, and the development of open science policy which embraces transparency principles has been described. The importance of transparency within research practice and associated scholarly communications has been highlighted. Within this area, the current &#x201C;retraction epidemic&#x201D; has been noted together with new initiatives to assess the reproducibility and replicability of key studies in certain domains. Prior models and expositions of open science have been described and, in particular, a two-dimensional continuum approach has been cited as the basis for the 3D Model of Open Science presented in this paper, which adds a transparency dimension as a third axis. A foundational series of terms associated with transparency has been proposed and situated within a research lifecycle. The implications for libraries and information services have been explored and four broad areas of potential opportunity have been identified.</p>
<p>However, there is much work still to be carried out to realise the full promise of open science. At the practice level, a more detailed analysis and exposition of the ways in which transparency can be achieved, in terms of specific actions or interventions by particular transparency agents throughout the research lifecycle, is needed. The TOP Guidelines go some way towards this goal, but the role of libraries and librarians has not been addressed. As a potentially critical &#x2018;Transparency Agent&#x2019;, a data librarian may be able to advocate, train, guide and support researchers in following recommended transparency standards.</p>
<p>The question of how libraries and librarians will react to this new policy and practice objective of &#x2018;transparency&#x2019; in open science and open data remains to be investigated. It is acknowledged that there is a need for further research into transparency perspectives and perceptions, as well as into curriculum development and graduate education in this context. To this end, a new strand of qualitative research at the iSchool, University of Pittsburgh, is investigating the attitudes, awareness and activities of academic librarians towards research transparency and open science. It is hoped that the findings will inform future directions for institutional data policy, research data services and educational curriculum development.</p>
</sec>
</body>
<back>
<ack>
<title>Acknowledgement</title>
<p>This paper was accepted and presented at the LIBER 2015 conference in London.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="r1"><mixed-citation>American Meteorological Society. (2013). <italic>Full and open access to data.</italic> Retrieved January 19, 2015, from <ext-link ext-link-type="uri" xlink:href="https://www2.ametsoc.org/ams/index.cfm/about-ams/ams-statements/statements-of-the-ams-in-force/full-and-open-access-to-data/">https://www2.ametsoc.org/ams/index.cfm/about-ams/ams-statements/statements-of-the-ams-in-force/full-and-open-access-to-data/</ext-link>.</mixed-citation></ref>
<ref id="r2"><mixed-citation>Borgman, C.L. (2015). <italic>Big data, little data, no data: Scholarship in the networked world</italic>. Cambridge, MA: MIT Press.</mixed-citation></ref>
<ref id="r3"><mixed-citation>Carey, B. (2011, November 2). Fraud case seen as red flag for psychology research. <italic>The New York Times</italic>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.nytimes.com/2011/11/03/health/research/noted-dutch-psychologist-stapel-accused-of-research-fraud.html?_r=0">http://www.nytimes.com/2011/11/03/health/research/noted-dutch-psychologist-stapel-accused-of-research-fraud.html?_r&#x003D;0</ext-link>.</mixed-citation></ref>
<ref id="r4"><mixed-citation>Center for Open Science. (2015). <italic>Guidelines for transparency and openness promotion (TOP) in journal policies and practices &#x201C;The TOP guidelines&#x201D;.</italic> Retrieved January 19, 2016, from <ext-link ext-link-type="uri" xlink:href="https://osf.io/ud578/">https://osf.io/ud578/</ext-link>.</mixed-citation></ref>
<ref id="r5"><mixed-citation>Cicerone, R.J. (2015). <italic>Research reproducibility, replicability, reliability</italic> (Speech). Retrieved January 19, 2016, from <ext-link ext-link-type="uri" xlink:href="http://www.nasonline.org/about-nas/leadership/president/2015-presidents-address.pdf">http://www.nasonline.org/about-nas/leadership/president/2015-presidents-address.pdf</ext-link>.</mixed-citation></ref>
<ref id="r6"><mixed-citation>Collins, F.S., &#x0026; Tabak, L.A. (2014). NIH plans to enhance reproducibility. <italic>Nature</italic>, <italic>505</italic>, 612&#x2013;613. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.nature.com/polopoly_fs/1.14586!/menu/main/topColumns/topLeftColumn/pdf/505612a.pdf">http://www.nature.com/polopoly_fs/1.14586!/menu/main/topColumns/topLeftColumn/pdf/505612a.pdf</ext-link>.</mixed-citation></ref>
<ref id="r7"><mixed-citation>Corrall, S., &#x0026; Pinfield, S. (2014). Coherence of &#x201C;open&#x201D; initiatives in higher education and research: Framing a policy agenda. In <italic>iConference 2014 Proceedings</italic> (pp. 293&#x2013;313). doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.9776/14085">10.9776/14085</ext-link>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="https://www.ideals.illinois.edu/bitstream/handle/2142/47316/085_corrVersion.pdf?sequence=5">https://www.ideals.illinois.edu/bitstream/handle/2142/47316/085_corrVersion.pdf?sequence&#x003D;5</ext-link>.</mixed-citation></ref>
<ref id="r8"><mixed-citation>Corrall, S., Kennan, M.A., &#x0026; Afzal, W. (2013). Bibliometrics and research data management services: Emerging trends in library support for research. <italic>Library Trends</italic>, <italic>61</italic>(3), 636&#x2013;674. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://d-scholarship.pitt.edu/18948/1/08_61_3_corrall_636-674.pdf">http://d-scholarship.pitt.edu/18948/1/08_61_3_corrall_636-674.pdf</ext-link>.</mixed-citation></ref>
<ref id="r9"><mixed-citation>Cox, A.M., &#x0026; Pinfield, S. (2014). Research data management and libraries: Current activities and future priorities. <italic>Journal of Librarianship and Information Science</italic>, <italic>46</italic>(4), 299&#x2013;316. doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1177/0961000613492542">10.1177/0961000613492542</ext-link>.</mixed-citation></ref>
<ref id="r10"><mixed-citation>Denker, S.P. (2016, January 6). <italic>A proactive approach to reproducibility with evidence-based research on research</italic>. PLOS Blog post January 6 2016. Retrieved 19 January, 2016, from <ext-link ext-link-type="uri" xlink:href="http://blogs.plos.org/plos/2016/01/a-proactive-approach-to-reproducibility-with-evidence-based-research-on-research/">http://blogs.plos.org/plos/2016/01/a-proactive-approach-to-reproducibility-with-evidence-based-research-on-research/</ext-link>.</mixed-citation></ref>
<ref id="r11"><mixed-citation>Easterbrook, S.M. (2014). Open code for open science? <italic>Nature Geoscience</italic>, <italic>7</italic>, 779&#x2013;781. doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1038/ngeo2283">10.1038/ngeo2283</ext-link>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.nature.com/ngeo/journal/v7/n11/pdf/ngeo2283.pdf">http://www.nature.com/ngeo/journal/v7/n11/pdf/ngeo2283.pdf</ext-link>.</mixed-citation></ref>
<ref id="r12"><mixed-citation>Errington, T.M., Iorns, E., Gunn, W., Tan, F.E., Lomax, J., &#x0026; Nosek, B.A. (2014). An open investigation of the reproducibility of cancer biology research. <italic>eLife 2014</italic>, 3, e04333. doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.7554/eLife.04333">10.7554/eLife.04333</ext-link>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4270077/">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4270077/</ext-link>.</mixed-citation></ref>
<ref id="r13"><mixed-citation>Etzioni, A. (2010). Is transparency the best disinfectant? <italic>Journal of Political Philosophy</italic>, <italic>18</italic>(4), 1&#x2013;16. doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1111/j.1467-9760.2010.00366.x">10.1111/j.1467-9760.2010.00366.x</ext-link>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.gwu.edu/~ccps/etzioni/documents/295Transparency.pdf">http://www.gwu.edu/~ccps/etzioni/documents/295Transparency.pdf</ext-link>.</mixed-citation></ref>
<ref id="r15"><mixed-citation>Fang, F.C., Steen, R.G., &#x0026; Casadevall, A. (2013). Misconduct accounts for the majority of retracted scientific publications<italic>. Proceedings of the National Academy of Sciences of the United States of America</italic>, <italic>109</italic>(42), 17028&#x2013;17033. Retrieved September 10, 2015, from doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1073/pnas.1212247109">10.1073/pnas.1212247109</ext-link>.</mixed-citation></ref>
<ref id="r16"><mixed-citation>FASEB. (2016, January 14). <italic>Enhancing research reproducibility: Recommendations from the Federation of American Societies for Experimental Biology</italic>. Retrieved January 19, 2016, from: <ext-link ext-link-type="uri" xlink:href="http://www.faseb.org/Portals/2/PDFs/opa/2016/FASEB_Enhancing%20Research%20Reproducibility.pdf">http://www.faseb.org/Portals/2/PDFs/opa/2016/FASEB_Enhancing&#x0025;20Research&#x0025;20Reproducibility.pdf</ext-link>.</mixed-citation></ref>
<ref id="r17"><mixed-citation>Gezelter, J.D. (n.d.). <italic>Open science and verifiability</italic>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://web.stanford.edu/~vcs/Nov21/dg-OpenScienceandVerifiability.pdf">http://web.stanford.edu/~vcs/Nov21/dg-OpenScienceandVerifiability.pdf</ext-link>.</mixed-citation></ref>
<ref id="r18"><mixed-citation>Gov.UK. (2013). <italic>G8 Open data charter and technical annex</italic>. Retrieved January 19, 2016, from: <ext-link ext-link-type="uri" xlink:href="https://www.gov.uk/government/publications/open-data-charter/g8-open-data-charter-and-technical-annex">https://www.gov.uk/government/publications/open-data-charter/g8-open-data-charter-and-technical-annex</ext-link>.</mixed-citation></ref>
<ref id="r19"><mixed-citation>Holdren, J.P., Orszag, P., &#x0026; Prouty, P.F. (2009, February 24). <italic>President&#x2019;s memorandum on transparency and open government &#x2013; Interagency collaboration.</italic> Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="https://www.whitehouse.gov/sites/default/files/omb/assets/memoranda_fy2009/m09-12.pdf">https://www.whitehouse.gov/sites/default/files/omb/assets/memoranda_fy2009/m09-12.pdf</ext-link>.</mixed-citation></ref>
<ref id="r20"><mixed-citation>ICSU. (2015). <italic>Open data in a big data world. An international accord</italic>. Retrieved January 19, 2016, from <ext-link ext-link-type="uri" xlink:href="http://www.icsu.org/science-international/accord/open-data-in-a-big-data-world-long">http://www.icsu.org/science-international/accord/open-data-in-a-big-data-world-long</ext-link>.</mixed-citation></ref>
<ref id="r21"><mixed-citation>Ince, D. (2011). The Duke University scandal &#x2013; what can be done? <italic>Significance</italic>, <italic>8</italic>(3), 113&#x2013;115. Retrieved September 10, 2015, from doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1111/j.1740-9713.2011.00505.x">10.1111/j.1740-9713.2011.00505.x</ext-link>.</mixed-citation></ref>
<ref id="r22"><mixed-citation>Lauer, M. (2015, October 30). <italic>Bolstering trust in science through rigorous standards.</italic> NIH Open Mike Blog post. Retrieved January 19, 2016, from <ext-link ext-link-type="uri" xlink:href="https://nexus.od.nih.gov/all/2015/10/30/bolstering-trust-in-science-through-rigorous-standards/">https://nexus.od.nih.gov/all/2015/10/30/bolstering-trust-in-science-through-rigorous-standards/</ext-link>.</mixed-citation></ref>
<ref id="r23"><mixed-citation>Lyon, L. (2009). <italic>Open science at web-scale: Optimising participation and predictive potential consultative report</italic>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://opus.bath.ac.uk/30056/1/open-science-report-6nov09-final-sentojisc.pdf">http://opus.bath.ac.uk/30056/1/open-science-report-6nov09-final-sentojisc.pdf</ext-link>.</mixed-citation></ref>
<ref id="r24"><mixed-citation>Lyon, L., &#x0026; Beaton, B. (2015). Citizen science, open access, open data, and research inclusivity. In <italic>2015 ALISE Annual Conference</italic>, Chicago. Abstract retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="https://alise2015.sched.org/event/a881a3eeb8aac9e003eb3831916bb92a#.VfGyBZfWyio">https://alise2015.sched.org/event/a881a3eeb8aac9e003eb3831916bb92a&#x0023;.VfGyBZfWyio</ext-link>.</mixed-citation></ref>
<ref id="r25"><mixed-citation>Lyon, L., &#x0026; Brenner, A. (2015). Bridging the data talent gap: Positioning the iSchool as an agent for change. <italic>International Journal of Digital Curation</italic> <italic>10</italic>(1), 111&#x2013;122. doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.2218/ijdc.v10i1.349">10.2218/ijdc.v10i1.349</ext-link>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.ijdc.net/index.php/ijdc/article/viewFile/10.1.111/384">http://www.ijdc.net/index.php/ijdc/article/viewFile/10.1.111/384</ext-link></mixed-citation></ref>
<ref id="r26"><mixed-citation>Lyon, L., Mattern, E., Acker, A., &#x0026; Langmead, A. (n.d.). <italic>Applying translational principles to data science curriculum development.</italic> In: iPRES 2015 Conference, Chapel Hill, North Carolina. (To be published in the Phaidra Repository).</mixed-citation></ref>
<ref id="r27"><mixed-citation>Marcus, A., &#x0026; Oransky, I. (2012, August 1). Bring on the Transparency Index. <italic>The Scientist</italic>, 24. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.the-scientist.com/?articles.view/articleNo/32427/title/Bring-On-the-Transparency-Index/">http://www.the-scientist.com/?articles.view/articleNo/32427/title/Bring-On-the-Transparency-Index/</ext-link>.</mixed-citation></ref>
<ref id="r28"><mixed-citation>Mayernik, M.S., Callaghan, S., Leigh, R., Tedds, J., &#x0026; Worley, S. (2014). Peer review of datasets: When, why and how. <italic>Bulletin of the American Meteorological Society</italic>, <italic>96</italic>, 191&#x2013;201. doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1175/BAMS-D-13-00083.1">10.1175/BAMS-D-13-00083.1</ext-link>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://journals.ametsoc.org/doi/pdf/10.1175/BAMS-D-13-00083.1">http://journals.ametsoc.org/doi/pdf/10.1175/BAMS-D-13-00083.1</ext-link>.</mixed-citation></ref>
<ref id="r29"><mixed-citation>NIH. (2015a, June 9). <italic>Enhancing reproducibility through rigor and transparency</italic>. Notice Number NOT-OD-15-103. Retrieved January 19, 2016, from <ext-link ext-link-type="uri" xlink:href="http://grants.nih.gov/grants/guide/notice-files/NOT-OD-15-103.html">http://grants.nih.gov/grants/guide/notice-files/NOT-OD-15-103.html</ext-link>.</mixed-citation></ref>
<ref id="r30"><mixed-citation>NIH. (2015b, November 2) <italic>Rigor and reproducibility</italic>. Retrieved January 19, 2016, from <ext-link ext-link-type="uri" xlink:href="http://grants.nih.gov/reproducibility/index.htm">http://grants.nih.gov/reproducibility/index.htm</ext-link>.</mixed-citation></ref>
<ref id="r31"><mixed-citation>NIH. (2016, January 28). <italic>Scientific rigor in NIH grant applications</italic>. Blog post by Mike Lauer. Retrieved January 19, 2016, from <ext-link ext-link-type="uri" xlink:href="http://nexus.od.nih.gov/all/2016/01/28/scientific-rigor-in-nih-grant-applications/">http://nexus.od.nih.gov/all/2016/01/28/scientific-rigor-in-nih-grant-applications/</ext-link>.</mixed-citation></ref>
<ref id="r32"><mixed-citation>Nosek, B.A. (2012). An open, large-scale collaborative effort to estimate the reproducibility of psychological science. <italic>Perspectives on Psychological Science</italic>, <italic>7</italic>(6), 657&#x2013;660. doi: <ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1177/1745691612462588">10.1177/1745691612462588</ext-link>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://pps.sagepub.com/content/7/6/657.full.pdf+html">http://pps.sagepub.com/content/7/6/657.full.pdf&#x002B;html</ext-link>.</mixed-citation></ref>
<ref id="r33"><mixed-citation>Nosek, B.A., &#x0026; Lakens, D. (2014). Registered reports: A method to increase the credibility of published results [Editorial]. <italic>Social Psychology</italic>, <italic>45</italic>(3), 137&#x2013;141. Retrieved September 10, 2015, from doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1027/1864-9335/a000192">10.1027/1864-9335/a000192</ext-link>.</mixed-citation></ref>
<ref id="r34"><mixed-citation>OECD. (2007). <italic>OECD principles and guidelines for access to research data from public funding</italic>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.oecd.org/sti/sci-tech/38500813.pdf">http://www.oecd.org/sti/sci-tech/38500813.pdf</ext-link>.</mixed-citation></ref>
<ref id="r35"><mixed-citation>OECD. (2015). <italic>Making open science a reality</italic>. Retrieved January 19, 2016, from <ext-link ext-link-type="uri" xlink:href="https://www.innovationpolicyplatform.org/content/open-science">https://www.innovationpolicyplatform.org/content/open-science</ext-link>.</mixed-citation></ref>
<ref id="r36"><mixed-citation>Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. <italic>Science</italic>, <italic>349</italic>(6251), aac4716. doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1126/science.aac4716">10.1126/science.aac4716</ext-link>. Retrieved January 19, 2016, from <ext-link ext-link-type="uri" xlink:href="http://science.sciencemag.org/content/349/6251/aac4716.full">http://science.sciencemag.org/content/349/6251/aac4716.full</ext-link>.</mixed-citation></ref>
<ref id="r37"><mixed-citation>Peng, R.D. (2011). Reproducible research in computational science. <italic>Science</italic>, <italic>334</italic>(6060), 1226&#x2013;1227. doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1126/science.1213847">10.1126/science.1213847</ext-link>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.sciencemag.org/content/334/6060/1226.full.pdf">http://www.sciencemag.org/content/334/6060/1226.full.pdf</ext-link>.</mixed-citation></ref>
<ref id="r38"><mixed-citation>RCUK. (2015). RCUK Common Principles on Data Policy. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.rcuk.ac.uk/research/DataPolicy/">http://www.rcuk.ac.uk/research/DataPolicy/</ext-link>.</mixed-citation></ref>
<ref id="r39"><mixed-citation>Royal Society Report. (2012). <italic>Science as an open enterprise: open data for open science</italic>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="https://royalsociety.org/~/media/Royal_Society_Content/policy/projects/sape/2012-06-20-SAOE.pdf">https://royalsociety.org/~/media/Royal_Society_Content/policy/projects/sape/2012-06-20-SAOE.pdf</ext-link>.</mixed-citation></ref>
<ref id="r40"><mixed-citation>Si, L., Xing, W., Zhuang, X., Hua, X., &#x0026; Zhou, L. (2015). Investigation and analysis of research data services in university libraries. <italic>The Electronic Library</italic>, <italic>33</italic>(3), 417&#x2013;449. doi:<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1108/EL-07-2013-0130">10.1108/EL-07-2013-0130</ext-link>.</mixed-citation></ref>
<ref id="r41"><mixed-citation>Stodden, V. (2009). <italic>Enabling reproducible research: open licensing for scientific innovation</italic>. Social Science Research Network. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1362040">http://papers.ssrn.com/sol3/papers.cfm?abstract_id&#x003D;1362040</ext-link>.</mixed-citation></ref>
<ref id="r42"><mixed-citation>Stodden, V., Bailey, D.H., Borwein, J.M., LeVeque, R.J., Rider, W., &#x0026; Stein, W. (2013). <italic>Setting the default to reproducible: Reproducibility in computational and experimental mathematics.</italic> ICERM Workshop December 10&#x2013;14, 2012, Providence. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://stodden.net/icerm_report.pdf">http://stodden.net/icerm_report.pdf</ext-link>.</mixed-citation></ref>
<ref id="r43"><mixed-citation>Stodden, V., Leisch F., &#x0026; Peng, R.D. (2014). <italic>Implementing reproducible research</italic>. Boca Raton, FL: Chapman &#x0026; Hall/CRC Press. Available as an eBook.</mixed-citation></ref>
<ref id="r44"><mixed-citation>Taylor, B.N., &#x0026; Kuyatt, C.E. (1994). <italic>Guidelines for evaluating and expressing the uncertainty of NIST measurement results</italic>. NIST Technical Note 1297. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://physics.nist.gov/Pubs/guidelines/TN1297/tn1297s.pdf">http://physics.nist.gov/Pubs/guidelines/TN1297/tn1297s.pdf</ext-link>.</mixed-citation></ref>
<ref id="r45"><mixed-citation>The Economist. (2010, May 18). <italic>The science of climate change:</italic> <italic>The clouds of unknowing.</italic> Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.economist.com/node/15719298">http://www.economist.com/node/15719298</ext-link>.</mixed-citation></ref>
<ref id="r46"><mixed-citation>The Economist. (2013, October 17). <italic>Unreliable research: Trouble at the lab</italic>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.economist.com/news/briefing/21588057-scientists-think-science-self-correcting-alarming-degree-it-not-trouble">http://www.economist.com/news/briefing/21588057-scientists-think-science-self-correcting-alarming-degree-it-not-trouble</ext-link>.</mixed-citation></ref>
<ref id="r47"><mixed-citation>University of Bath. (2014). <italic>Research Data Policy</italic>. Retrieved September 10, 2015, from <ext-link ext-link-type="uri" xlink:href="http://www.bath.ac.uk/research/data/policy/research-data-policy.html">http://www.bath.ac.uk/research/data/policy/research-data-policy.html</ext-link>.</mixed-citation></ref>
</ref-list>
<fn-group>
<fn id="fn1"><p>Science Exchange Validation Service <ext-link ext-link-type="uri" xlink:href="http://validation.scienceexchange.com/#/">http://validation.scienceexchange.com/&#x0023;/</ext-link>.</p></fn>
<fn id="fn2"><p><ext-link ext-link-type="uri" xlink:href="https://publons.com/journal/530/gigascience">https://publons.com/journal/530/gigascience</ext-link>.</p></fn>
<fn id="fn3"><p>Research Data Alliance <ext-link ext-link-type="uri" xlink:href="https://rd-alliance.org/">https://rd-alliance.org/</ext-link>.</p></fn>
<fn id="fn4"><p>Open Science Framework <ext-link ext-link-type="uri" xlink:href="https://osf.io/">https://osf.io/</ext-link>.</p></fn>
<fn id="fn5"><p>DMPTool <ext-link ext-link-type="uri" xlink:href="https://dmptool.org/">https://dmptool.org/</ext-link>.</p></fn>
<fn id="fn6"><p>ImpactStory <ext-link ext-link-type="uri" xlink:href="https://impactstory.org/">https://impactstory.org/</ext-link>.</p></fn>
<fn id="fn7"><p>Figshare <ext-link ext-link-type="uri" xlink:href="http://figshare.com/">http://figshare.com/</ext-link>.</p></fn>
</fn-group>
</back>
</article>