IASSIST Quarterly
https://www.iassistquarterly.com/index.php/iassist

The IASSIST Quarterly at https://iassistquarterly.com is a peer-reviewed, indexed, open access quarterly publication of articles dealing with social science information and data services, including relevant societal, legal, and ethical issues.

The IASSIST Quarterly represents an international cooperative effort on the part of individuals managing, operating, or using machine-readable data archives, data libraries, and data services. The IASSIST Quarterly reports on activities related to the production, acquisition, preservation, processing, distribution, and use of machine-readable data carried out by its members and others in the international social science community.

Publisher: International Association for Social Science Information Service and Technology
Language: en-US
ISSN: 0739-1137

License (CC BY-NC 4.0, https://creativecommons.org/licenses/by-nc/4.0/): "This license lets others remix, tweak, and build upon your work non-commercially, and although their new works must also acknowledge you and be non-commercial, they don't have to license their derivative works on the same terms."


Countries closing down - reproducibility keeping science open
https://www.iassistquarterly.com/index.php/iassist/article/view/981

Welcome to volume 44 of the IASSIST Quarterly. Here in 2020 we start with a double issue on reproducibility (IQ 44(1-2)).

The start of 2020 was marked by Corona. Though we are only in the middle of the year, we can say with confidence that 2020 will be known for the closing down of nearly all public life. In our very own world this included the move of the IASSIST 2020 conference to 2021. The closing down of societies took different forms, and this will and should be long debated and investigated, because many civil rights in open society were put on instant standby by governments, with various precautionary measures. Fortunately, many countries are now in the process of opening up. Hopefully, we are now more careful: keeping socially distant, practicing better sanitation, etc. We are also eagerly awaiting scientific breakthroughs: the vaccine, the better treatment, the cure. But Corona science extends beyond health and biology. Social science in particular has an obligation to make us better prepared to take necessary measures and to uphold democracy.

Social science has always faced the reliability issue that you cannot step into the same river twice: survey data collected at one time will not bring the same results in a subsequent data collection, even with the same panel of respondents. Reproducibility takes many more forms than exact data collection, though, and is foundational for open science and an open society. Science needs to be transparent in order to be challenged and improved. Fellow scientists as well as laymen should have the possibility of performing analyses to find out whether results can be reproduced.

I am therefore very happy to send my thanks to Harrison Dekker and Amy Riegelman for taking the initiative to create this special issue of the IASSIST Quarterly on reproducibility. Harrison Dekker is a data librarian at the University of Rhode Island and Amy Riegelman a librarian in social sciences at the University of Minnesota.
Together, Amy and Harrison reviewed the papers submitted for their special issue and wrote the introduction in the following pages. In addition to expressing my great appreciation to them, I also want to thank all the authors who submitted papers for this issue.

Thanks! Let's keep science open!

Submissions of papers for the IASSIST Quarterly are always very welcome. We welcome input from IASSIST conferences or other conferences and workshops, from local presentations, or papers especially written for the IQ. When you are preparing such a presentation, give a thought to turning your one-time presentation into a lasting contribution. Doing that after the event also gives you the opportunity to improve your work after feedback. We encourage you to log in or create an author login at https://www.iassistquarterly.com (our Open Journal Systems application). We permit authors 'deep links' into the IQ as well as deposit of the paper in your local repository. Chairing a conference session with the purpose of aggregating and integrating papers for a special issue of the IQ is also much appreciated, as the information reaches many more people than the limited number of session participants and will be readily available on the IASSIST Quarterly website at https://www.iassistquarterly.com. Authors are very welcome to take a look at the instructions and layout:

https://www.iassistquarterly.com/index.php/iassist/about/submissions

Authors can also contact me directly via e-mail: kbr@sam.sdu.dk. Should you be interested in compiling a special issue for the IQ as guest editor(s), I will also be delighted to hear from you.

Karsten Boye Rasmussen - June 2020

Karsten Boye Rasmussen
Copyright (c) 2020 Karsten Boye Rasmussen (CC BY-NC 4.0: http://creativecommons.org/licenses/by-nc/4.0)
Published 2020-07-02 | IASSIST Quarterly 44(1-2), pp. 1-2 | DOI: 10.29173/iq981


Advocating for reproducibility
https://www.iassistquarterly.com/index.php/iassist/article/view/982

As guest editors, we are excited to publish this special double issue of IASSIST Quarterly. The topics of reproducibility, replicability, and transparency have been addressed in past issues of IASSIST Quarterly and at the IASSIST conference, but this double issue is entirely focused on these issues.

In recent years, efforts "to improve the credibility of science by advancing transparency, reproducibility, rigor, and ethics in research" have gained momentum in the social sciences (Center for Effective Global Action, 2020). While few question the spirit of the reproducibility and research transparency movement, it faces significant challenges because it goes against the grain of established practice.

We believe the data services community is in a unique position to help advance this movement given our data and technical expertise, training and consulting work, international scope, established role in data management and preservation, and more.
As evidence of the movement, several initiatives exist to support research reproducibility infrastructure and data preservation efforts:

- Center for Open Science (COS) / Open Science Framework (OSF): https://osf.io/
- Berkeley Initiative for Transparency in the Social Sciences (BITSS): https://www.bitss.org/
- CUrating for REproducibility (CURE): http://cure.web.unc.edu/
- Project TIER: https://www.projecttier.org/
- Data Curation Network: https://datacurationnetwork.org/
- UK Reproducibility Network: https://ukrn.org/

While many new initiatives have launched in recent years, the data services community was supporting reproducibility in a variety of ways (e.g., data management, data preservation, metadata standards) in well-established consortia such as the Inter-university Consortium for Political and Social Research (ICPSR) long before the phrase "reproducibility crisis" came into common use and before Ioannidis published the essay "Why most published research findings are false" (Ioannidis, 2005).

The articles in this issue address several very important aspects of reproducible research:

- Identification of barriers to reproducibility and solutions to such barriers
- Evidence synthesis as related to transparent reporting and reproducibility
- Reflection on how information professionals, researchers, and librarians perceive the reproducibility crisis and how they can partner to help solve it.

The issue begins with "Reproducibility literature analysis", which looks at existing resources and literature to identify barriers to reproducibility and potential solutions. The authors have compiled a comprehensive list of resources with annotations that include definitions of key concepts pertinent to the reproducibility crisis.

The next article addresses data reuse from the perspective of a large research university. The authors examine instances of both successful and failed data reuse and identify best practices for librarians interested in conducting research involving the common forms of data collected in an academic library.

Systematic reviews are a research approach that involves the quantitative and/or qualitative synthesis of data collected through a comprehensive literature review. "Methods reporting that supports reader confidence for systematic reviews in psychology" looks at the reproducibility of electronic literature searches reported in psychology systematic reviews.

A fundamental challenge in reproducing or replicating computational results is the need for researchers to make available the code used in producing these results. But sharing code and having it run correctly for another user can present significant technical challenges. In "Reproducibility, preservation, and access to research with ReproZip and ReproServer" the authors describe open source software that they are developing to address these challenges.

Taking a published article and attempting to reproduce its results is an exercise that is sometimes used in academic courses to highlight the inherent difficulty of the process.
The final article in this issue, "ReprohackNL 2019: How libraries can promote research reproducibility through community engagement", describes an innovative library-based variation on this exercise.

Harrison Dekker, Data Librarian, University of Rhode Island
Amy Riegelman, Social Sciences Librarian, University of Minnesota

References

Center for Effective Global Action (2020), About the Berkeley Initiative for Transparency in the Social Sciences. Available at: https://www.bitss.org/about (accessed 23 June 2020).

Ioannidis, J.P. (2005) 'Why most published research findings are false', PLoS Medicine, 2(8), p. e124. doi: https://doi.org/10.1371/journal.pmed.0020124

Harrison Dekker, Amy Riegelman
Copyright (c) 2020 Harrison Dekker, Amy Riegelman (CC BY-NC 4.0: http://creativecommons.org/licenses/by-nc/4.0)
Published 2020-07-02 | IASSIST Quarterly 44(1-2), pp. 1-2 | DOI: 10.29173/iq982


Reproducibility literature analysis - a federal information professional perspective
https://www.iassistquarterly.com/index.php/iassist/article/view/967

This article examines a cross-section of literature and other resources to reveal common reproducibility issues faced by stakeholders regardless of subject area or focus. We identify a variety of issues named as reproducibility barriers and the solutions to such barriers, and reflect on how researchers and information professionals can act to address the 'reproducibility crisis.' The finished products of this work include an annotated list of 122 published resources and a primer that identifies and defines key concepts from the resources that contribute to the crisis.

Erin Antognoli, Regina L. Avila, Jonathan Sears, Leighton L. Christiansen, Jessica Tieman, Jacquelyn Hart
Copyright (c) 2020 Erin Antognoli, Regina L. Avila, Jonathan Sears, Leighton L. Christiansen, Jessica Tieman, Jacquelyn Hart (CC BY-NC 4.0: http://creativecommons.org/licenses/by-nc/4.0)
Published 2020-06-29 | IASSIST Quarterly 44(1-2), pp. 1-26 | DOI: 10.29173/iq967


Learning from data reuse: successful and failed experiences in a large public research university library
https://www.iassistquarterly.com/index.php/iassist/article/view/966

This paper illustrates a large public research university library's experience in reusing data, collected both within and outside of the library, to demonstrate data reuse practice. The purpose of the paper is to 1) demonstrate when and how data are reused in a large public research university library, 2) share tips on what to consider when reusing data, and 3) share challenges and lessons learned from data reuse experiences. This paper presents five proposed opportunities for data reuse conducted by three researchers at the institution's library, which resulted in three successful instances of data reuse and two failed ones.
Learning from successful and failed experiences is critical to understanding what works and what does not, in order to identify best practices for data reuse. This paper will be helpful for librarians who intend to reuse data for publication.

Jung Mi Scoulas, Sandra L. De Groote, Paula R. Dempsey
Copyright (c) 2020 Jung Mi Scoulas (CC BY-NC 4.0: http://creativecommons.org/licenses/by-nc/4.0)
Published 2020-06-29 | IASSIST Quarterly 44(1-2), pp. 1-15 | DOI: 10.29173/iq966


Methods reporting that supports reader confidence for systematic reviews in psychology: assessing the reproducibility of electronic searches and first-level screening decisions
https://www.iassistquarterly.com/index.php/iassist/article/view/968

Recent discussions and research in psychology show a significant emphasis on reproducibility. Concerns for reproducibility pertain to methods as well as results. We evaluated the reporting of the electronic search methods used for systematic reviews (SR) published in psychology. Such reports are key for determining the reproducibility of electronic searches. The use of SR has been increasing in psychology, and we report on the status of reporting of electronic searches in recent SR in psychology.

We used 12 checklist items to evaluate reporting for basic electronic search strategies. Kappa results for those items developed from evidence-based recommendations ranged from fair to almost perfect. Additionally, using a set of those items to represent a "PRISMA" type of recommended reporting showed that only one of the 25 randomly selected psychology SR from 2009-2012 reported recommended information for all items in the set, and none of the 25 psychology SR from 2014-2016 did so. Using a second, less stringent set of items, we found that only 36% of the psychology SR reported basic information that supports confidence in the reproducibility of electronic searches. Similar results were found for a set of psychology SR published in 2017.

An area for improvement in SR in psychology involves fuller and clearer reporting of the steps used for electronic searches. Such improvements will provide a strong basis for confidence in the reproducibility of searches. That confidence, in turn, can strengthen reader confidence more generally in the results and conclusions reached in SR in psychology.
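The kappa values the abstract reports ("fair" to "almost perfect") are Cohen's kappa, a chance-corrected measure of agreement between two coders, with the verbal labels taken from the conventional Landis-Koch scale. As a minimal sketch of the statistic, the following Python computes kappa for two raters' screening decisions; the data and variable names are hypothetical illustrations, not values from the study.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # Observed agreement: share of items on which the two raters concur.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[label] / n) * (freq_b[label] / n)
              for label in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical first-level screening decisions on one checklist item for
# ten reviews (1 = search reported, 0 = not reported); not study data.
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(f"kappa = {cohen_kappa(rater_a, rater_b):.2f}")
# -> kappa = 0.52, "moderate" agreement on the Landis-Koch scale
```

Note that the two raters agree on 8 of 10 items (80%), yet kappa is only 0.52, because much of that raw agreement is expected by chance given how often each rater uses each label.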
Paul Fehrmann, Megan Mamolen
Copyright (c) 2020 Paul Fehrmann, Megan Mamolen (CC BY-NC 4.0: http://creativecommons.org/licenses/by-nc/4.0)
Published 2020-06-29 | IASSIST Quarterly 44(1-2), pp. 1-26 | DOI: 10.29173/iq968


Reproducibility, preservation, and access to research with ReproZip and ReproServer
https://www.iassistquarterly.com/index.php/iassist/article/view/969

The adoption of reproducibility remains low, despite incentives becoming increasingly common in different domains, conferences, and journals. The truth is, reproducibility is technically difficult to achieve due to the complexities of computational environments. To address these technical challenges, we created ReproZip, an open-source tool that automatically packs research along with all the necessary information to reproduce it, including data files, software, OS version, and environment variables. Everything is then bundled into an rpz file, which users can use to reproduce the work with ReproZip and a suitable unpacker (e.g., using Vagrant or Docker). The rpz file is general and contains rich metadata: more unpackers can be added as needed, better guaranteeing long-term preservation. However, installing the unpackers can still be burdensome for secondary users of ReproZip bundles. In this paper, we will discuss how ReproZip and our new tool, ReproServer, can be used together to facilitate access to well-preserved, reproducible work. ReproServer is a web application that allows users to upload or provide a link to a ReproZip bundle, and then interact with/reproduce the contents from the comfort of their browser. Users are then provided a persistent link to the unpacked work on ReproServer, which they can share with reviewers or colleagues.
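To make the trace-pack-unpack workflow the abstract describes concrete, here is a minimal sketch driven from Python. The subcommands (reprozip trace/pack, reprounzip docker setup/run) follow ReproZip's standard command-line interface; the experiment command (python analysis.py) and the bundle and directory names are hypothetical stand-ins.

```python
import subprocess

def run(cmd):
    """Echo and execute one step of the ReproZip workflow."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

# -- On the author's machine: trace the experiment, then pack it. --
# ReproZip observes the run (files read, libraries loaded, OS details)
# and records everything needed to repeat it.
run(["reprozip", "trace", "python", "analysis.py"])
run(["reprozip", "pack", "my_experiment.rpz"])  # bundle trace + files into .rpz

# -- On a secondary user's machine: unpack and re-run with Docker. --
# Docker is one of the unpackers mentioned in the abstract; Vagrant
# works the same way ("reprounzip vagrant setup/run").
run(["reprounzip", "docker", "setup", "my_experiment.rpz", "repro_dir"])
run(["reprounzip", "docker", "run", "repro_dir"])
```

ReproServer, as the abstract explains, removes even this remaining burden by running the unpack step server-side, so a secondary user only needs a browser.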
Vicky Steeves, Rémi Rampin, Fernando Chirigati
Copyright (c) 2020 Vicky Steeves, Rémi Rampin, Fernando Chirigati (CC BY-NC 4.0: http://creativecommons.org/licenses/by-nc/4.0)
Published 2020-06-29 | IASSIST Quarterly 44(1-2), pp. 1-11 | DOI: 10.29173/iq969


ReprohackNL 2019: how libraries can promote research reproducibility through community engagement
https://www.iassistquarterly.com/index.php/iassist/article/view/977

University libraries play a crucial role in moving towards Open Science, contributing to more transparent, reproducible, and reusable research. The Center for Digital Scholarship (CDS) at Leiden University (LU) Library is a scholarly lab that promotes open science literacy among Leiden's scholars through two complementary strategies: existing top-down structures are used to provide training and services, while bottom-up initiatives from the research community are actively supported by offering the CDS's expertise and facilities. An example of how bottom-up initiatives can blossom with the help of library structures such as the CDS is ReproHack. ReproHack – a reproducibility hackathon – is a grassroots initiative by young scholars with the goal of improving research reproducibility in three ways. First, hackathon attendees learn about reproducibility tools and challenges by reproducing published results and providing feedback to authors on their attempt. Second, authors can nominate their work and receive feedback on their reproducibility efforts. Third, the collaborative atmosphere helps build a community interested in making their own research reproducible.

The first ReproHack in the Netherlands took place on November 30th, 2019, co-organised by the CDS at the LU Library, with 44 participants from the fields of psychology, engineering, biomedicine, and computer science. For 19 papers, 24 feedback forms were returned, and five papers were reported as successfully reproduced. Besides the researchers' learning experience, the event led to recommendations on how to enhance research reproducibility. The ReproHack format therefore provides an opportunity for libraries to improve scientific reproducibility through community engagement.

Kristina Hettne, Ricarda Proppert, Linda Nab, L. Paloma Rojas-Saunero, Daniela Gawehns
Copyright (c) 2020 Kristina Hettne, Ricarda Proppert, Linda Nab, L. Paloma Rojas-Saunero, Daniela Gawehns (CC BY-NC 4.0: http://creativecommons.org/licenses/by-nc/4.0)
Published 2020-07-02 | IASSIST Quarterly 44(1-2), pp. 1-10 | DOI: 10.29173/iq977