IASSIST Quarterly 2020-07-06T06:11:24-06:00 Karsten Boye Rasmussen Open Journal Systems <p class="p1">The <strong>IASSIST Quarterly</strong> is a peer-reviewed, indexed, open access quarterly publication of articles dealing with social science information and data services, including relevant societal, legal, and ethical issues.</p> <p class="p1">The <strong>IASSIST Quarterly</strong> represents an international cooperative effort on the part of individuals managing, operating, or using machine-readable data archives, data libraries, and data services. The <strong>IASSIST Quarterly</strong> reports on activities related to the production, acquisition, preservation, processing, distribution, and use of machine-readable data carried out by its members and others in the international social science community.</p> Reproducibility literature analysis - a federal information professional perspective 2020-07-06T06:11:21-06:00 Erin Antognoli Regina L. Avila Jonathan Sears Leighton L. Christiansen Jessica Tieman Jacquelyn Hart <p style="margin: 0px 0px 8px;"><span lang="EN" style="margin: 0px; font-family: 'Calibri',sans-serif;"><span style="font-size: medium;">This article examines a cross-section of literature and other resources to reveal common reproducibility issues faced by stakeholders regardless of subject area or focus. We identify a variety of issues named as reproducibility barriers and the solutions to such barriers, and reflect on how researchers and information professionals can act to address the ‘reproducibility crisis.’ The finished products of this work include an annotated list of 122 published resources and a primer that identifies and defines key concepts from the resources that contribute to the crisis.</span></span></p> 2020-06-29T00:00:00-06:00 Copyright (c) 2020 Erin Antognoli, Regina L. Avila, Jonathan Sears, Leighton L. 
Christiansen, Jessica Tieman, Jacquelyn Hart Learning from data reuse: successful and failed experiences in a large public research university library 2020-07-06T06:11:22-06:00 Jung Mi Scoulas Sandra L. De Groote Paula R. Dempsey <p>This paper illustrates how a large public research university library reused data collected both within and outside the library to demonstrate data reuse in practice. The purpose of the paper is to 1) demonstrate when and how data are reused in a large public research university library, 2) share tips on what to consider when reusing data, and 3) share challenges and lessons learned from data reuse experiences. This paper presents five proposed data reuse opportunities pursued by three researchers at the institution’s library, which resulted in three successful and two failed instances of data reuse. Learning from successful and failed experiences is critical to understanding what works and what does not, and to identifying best practices for data reuse. This paper will be helpful for librarians who intend to reuse data for publication.</p> 2020-06-29T00:00:00-06:00 Copyright (c) 2020 Jung Mi Scoulas Methods reporting that supports reader confidence for systematic reviews in psychology: assessing the reproducibility of electronic searches and first-level screening decisions. 2020-07-06T06:11:22-06:00 Paul Fehrmann Megan Mamolen <p>Recent discussions and research in psychology show a significant emphasis on reproducibility. Concerns for reproducibility pertain to methods as well as results. We evaluated the reporting of the electronic search methods used for systematic reviews (SR) published in psychology. Such reports are key for determining the reproducibility of electronic searches. 
The use of SR has been increasing in psychology, and we report on the status of reporting of electronic searches in recent psychology SR.</p> <p>We used 12 checklist items to evaluate reporting for basic electronic strategies. Kappa results for those items, developed from evidence-based recommendations, ranged from fair to almost perfect. Additionally, using a set of those items to represent a “PRISMA” type of recommended reporting showed that only one of the 25 randomly selected psychology SR from 2009-2012 reported recommended information for all items in the set, and none of the 25 psychology SR from 2014-2016 did so. Using a second, less stringent set of items, we found that only 36% of the psychology SR reported basic information that supports confidence in the reproducibility of electronic searches. Similar results were found for a set of psychology SR published in 2017.</p> <p>An area for improvement in SR in psychology involves fuller and clearer reporting of the steps used for electronic searches. Such improvements will provide a strong basis for confidence in the reproducibility of searches. 
That confidence, in turn, can strengthen reader confidence more generally in the results and conclusions reached in SR in psychology.</p> 2020-06-29T00:00:00-06:00 Copyright (c) 2020 Paul Fehrmann, Megan Mamolen Reproducibility, preservation, and access to research with ReproZip and ReproServer 2020-07-06T06:11:23-06:00 Vicky Steeves Rémi Rampin Fernando Chirigati <p class="western" style="margin-bottom: 0.14in;" align="justify">The adoption of reproducibility remains low, despite incentives becoming increasingly common in different domains, conferences, and journals. The truth is, reproducibility is technically difficult to achieve due to the complexities of computational environments. To address these technical challenges, we created ReproZip, an open-source tool that automatically packs research along with all the necessary information to reproduce it, including data files, software, OS version, and environment variables. Everything is then bundled into an <code>rpz</code> file, which users can use to reproduce the work with ReproZip and a suitable unpacker (e.g., using Vagrant or Docker). The <code>rpz</code> file is general and contains rich metadata: more unpackers can be added as needed, better guaranteeing long-term preservation. However, installing the unpackers can still be burdensome for secondary users of ReproZip bundles. In this paper, we will discuss how ReproZip and our new tool, ReproServer, can be used together to facilitate access to well-preserved, reproducible work. ReproServer is a web application that allows users to upload or provide a link to a ReproZip bundle, and then interact with and reproduce the contents from the comfort of their browser. Users are then provided a persistent link to the unpacked work on ReproServer, which they can share with reviewers or colleagues.</p> 2020-06-29T00:00:00-06:00 Copyright (c) 2020 Vicky Steeves, Rémi Rampin, Fernando Chirigati ReprohackNL 2019: how libraries can promote research reproducibility through community engagement 2020-07-06T06:11:23-06:00 Kristina Hettne Ricarda Proppert Linda Nab L. Paloma Rojas-Saunero Daniela Gawehns <p>University libraries play a crucial role in moving towards Open Science, contributing to more transparent, reproducible, and reusable research. The Center for Digital Scholarship (CDS) at Leiden University (LU) Library is a scholarly lab that promotes open science literacy among Leiden’s scholars through two complementary strategies: existing top-down structures are used to provide training and services, while bottom-up initiatives from the research community are actively supported by offering the CDS’s expertise and facilities. An example of how bottom-up initiatives can blossom with the help of library structures such as the CDS is ReproHack. ReproHack – a reproducibility hackathon – is a grassroots initiative by young scholars with the goal of improving research reproducibility in three ways. First, hackathon attendees learn about reproducibility tools and challenges by reproducing published results and providing feedback to authors on their attempt. Second, authors can nominate their work and receive feedback on their reproducibility efforts. 
Third, the collaborative atmosphere helps build a community interested in making its own research reproducible.</p> <p>A first ReproHack in the Netherlands took place on November 30<sup>th</sup>, 2019, co-organised by the CDS at the LU Library with 44 participants from the fields of psychology, engineering, biomedicine, and computer science. For 19 papers, 24 feedback forms were returned and five papers were reported as successfully reproduced. Besides the researchers’ learning experience, the event led to recommendations on how to enhance research reproducibility. The ReproHack format therefore provides an opportunity for libraries to improve scientific reproducibility through community engagement.</p> 2020-07-02T00:00:00-06:00 Copyright (c) 2020 Kristina Hettne, Ricarda Proppert, Linda Nab, L. Paloma Rojas-Saunero, Daniela Gawehns Countries closing down - reproducibility keeping science open 2020-07-06T06:11:20-06:00 Karsten Boye Rasmussen <p>Welcome to volume 44 of the <em>IASSIST Quarterly</em>. Here in 2020 we start with a double issue on reproducibility (IQ 44(1-2)).</p> <p>The start of 2020 was dominated by Corona. Though we are only in the middle of the year, we can already say with confidence that 2020 will be known for the closing down of nearly all public life. From our very own world this included the move of the IASSIST 2020 conference to 2021. The closing down of societies took different forms, and this will and should be long debated and investigated, because many civil rights in open societies were put on instant standby by governments, with various precautionary measures. Fortunately, many countries are now in the process of opening up. Hopefully, we are now more careful: keeping socially distant, practicing better sanitation, etc. We are also eagerly awaiting scientific breakthroughs: the vaccine, the better treatment, the cure. But Corona science extends beyond health and biology. 
Social science in particular has an obligation to make us better prepared to take necessary measures and to uphold democracy.</p> <p>Social science has always faced the reliability issue that you cannot step into the same river twice: survey data collected at one time will not, in a subsequent data collection, bring the same results, even with the same panel of respondents. Reproducibility has many more forms than exact data collection, though, and is foundational for open science and an open society. Science needs to be transparent in order to be challenged and improved. Fellow scientists as well as laymen should have the possibility of performing analyses to find whether results can be reproduced.</p> <p>I am therefore very happy to send my thanks to Harrison Dekker and Amy Riegelman for taking the initiative to create this special issue of the <em>IASSIST Quarterly</em> on reproducibility. Harrison Dekker is a data librarian at the University of Rhode Island and Amy Riegelman a librarian in social sciences at the University of Minnesota. Together, Amy and Harrison reviewed the papers submitted for their special issue and wrote the introduction in the following pages. In addition to expressing my great appreciation to them, I also want to thank all the authors who submitted papers for this issue.</p> <p>Thanks! Let's keep science open again!</p> <p>Submissions of papers for the <em>IASSIST Quarterly</em> are always very welcome. We welcome input from IASSIST conferences or other conferences and workshops, from local presentations, or papers written especially for the <em>IQ</em>. When you are preparing such a presentation, give a thought to turning your one-time presentation into a lasting contribution. Doing that after the event also gives you the opportunity of improving your work after feedback. We encourage you to log in or create an author login in our Open Journal Systems application. 
We permit authors 'deep links' into the <em>IQ</em> as well as deposition of the paper in your local repository. Chairing a conference session with the purpose of aggregating and integrating papers for a special <em>IQ</em> issue is also much appreciated, as the information reaches many more people than the limited number of session participants and will be readily available on the <em>IASSIST Quarterly</em> website. Authors are very welcome to take a look at the instructions and layout there.</p> <p>Authors can also contact me directly via e-mail. Should you be interested in compiling a special issue for the <em>IQ</em> as guest editor(s), I will also be delighted to hear from you.</p> <p>Karsten Boye Rasmussen - June 2020</p> 2020-07-02T00:00:00-06:00 Copyright (c) 2020 Karsten Boye Rasmussen Advocating for reproducibility 2020-07-06T06:11:21-06:00 Harrison Dekker Amy Riegelman <p>As guest editors, we are excited to publish this special double issue of <em>IASSIST Quarterly</em>. The topics of reproducibility, replicability, and transparency have been addressed in past issues of <em>IASSIST Quarterly</em> and at the IASSIST conference, but this double issue is entirely devoted to them.</p> <p>In recent years, efforts “to improve the credibility of science by advancing transparency, reproducibility, rigor, and ethics in research” have gained momentum in the social sciences (Center for Effective Global Action, 2020). While few question the spirit of the reproducibility and research transparency movement, it faces significant challenges because it goes against the grain of established practice.</p> <p>We believe the data services community is in a unique position to help advance this movement, given our data and technical expertise, training and consulting work, international scope, and established role in data management and preservation. 
As evidence of the movement, several initiatives exist to support research reproducibility infrastructure and data preservation efforts:</p> <ul> <li><a href="">Center for Open Science (COS) / Open Science Framework (OSF)</a><a href="#_edn1" name="_ednref1">[i]</a></li> <li><a href="">Berkeley Initiative for Transparency in the Social Sciences (BITSS)</a><a href="#_edn2" name="_ednref2">[ii]</a></li> <li><a href="">CUrating for REproducibility (CURE)</a><a href="#_edn3" name="_ednref3">[iii]</a></li> <li><a href="">Project TIER</a><a href="#_edn4" name="_ednref4">[iv]</a></li> <li><a href="">Data Curation Network</a><a href="#_edn5" name="_ednref5">[v]</a></li> <li><a href="">UK Reproducibility Network</a><a href="#_edn6" name="_ednref6">[vi]</a></li> </ul> <p>While many new initiatives have launched in recent years, we know that — even before the now commonly used phrase “reproducibility crisis” and Ioannidis’s essay “Why Most Published Research Findings Are False” (Ioannidis, 2005) — the data services community was supporting reproducibility in a variety of ways (e.g., data management, data preservation, metadata standards) through well-established consortia such as the Inter-university Consortium for Political and Social Research (ICPSR).</p> <p>The articles in this issue address several very important aspects of reproducible research:</p> <ul> <li>Identification of barriers to reproducibility and solutions to such barriers</li> <li>Evidence synthesis as related to transparent reporting and reproducibility</li> <li>Reflection on how information professionals, researchers, and librarians perceive the reproducibility crisis and how they can partner to help solve it</li> </ul> <p>The issue begins with “Reproducibility literature analysis,” which looks at existing resources and literature to identify barriers to reproducibility and potential solutions. The authors have compiled a comprehensive list of resources with annotations that include definitions of key concepts pertinent to the reproducibility crisis.</p> <p>The next article addresses data reuse from the perspective of a large research university. 
The authors examine instances of both successful and failed data reuse and identify best practices for librarians interested in conducting research involving the common forms of data collected in an academic library.</p> <p>Systematic reviews are a research approach that involves the quantitative and/or qualitative synthesis of data collected through a comprehensive literature review. “Methods reporting that supports reader confidence for systematic reviews in psychology” looks at the reproducibility of electronic literature searches reported in psychology systematic reviews.</p> <p>A fundamental challenge in reproducing or replicating computational results is the need for researchers to make available the code used in producing those results. But sharing code and having it run correctly for another user can present significant technical challenges. In “Reproducibility, preservation, and access to research with ReproZip and ReproServer,” the authors describe open-source software that they are developing to address these challenges.</p> <p>Taking a published article and attempting to reproduce the results is an exercise that is sometimes used in academic courses to highlight the inherent difficulty of the process. The final article in this issue, “ReprohackNL 2019: how libraries can promote research reproducibility through community engagement,” describes an innovative library-based variation on this exercise.</p> <p>&nbsp;</p> <p>Harrison Dekker, Data Librarian, University of Rhode Island</p> <p>Amy Riegelman, Social Sciences Librarian, University of Minnesota</p> <p>&nbsp;</p> <p><strong>References</strong></p> <p>Center for Effective Global Action (2020) <em>About the Berkeley Initiative for Transparency in the Social Sciences</em>. Available at: <a href=""></a> (accessed 23 June 2020).</p> <p>Ioannidis, J.P. (2005) ‘Why most published research findings are false’, <em>PLoS Medicine</em>, 2(8), p. 
e124. doi: <a href=""></a></p> <p>&nbsp;</p> <p><a href="#_ednref1" name="_edn1">[i]</a></p> <p><a href="#_ednref2" name="_edn2">[ii]</a></p> <p><a href="#_ednref3" name="_edn3">[iii]</a></p> <p><a href="#_ednref4" name="_edn4">[iv]</a></p> <p><a href="#_ednref5" name="_edn5">[v]</a></p> <p><a href="#_ednref6" name="_edn6">[vi]</a></p> 2020-07-02T00:00:00-06:00 Copyright (c) 2020 Harrison Dekker, Amy Riegelman