CRRESS, Conference on Reproducibility and Replicability in Economics and the Social Sciences

The Conference on Reproducibility and Replicability in Economics and the Social Sciences is a series of virtual and in-person panels on reproducibility, replicability, and transparency in the social sciences. The purpose of scientific publishing is to disseminate robust research findings, exposing them to the scrutiny of peers and other interested parties. Scientific articles should accurately and completely document the origin and provenance of data and the analytical and computational methods used. Yet in recent years, doubts have been voiced about the adequacy of the information provided in scientific articles and their addenda. The conference sessions address the initiation of research, the conduct of research, the preparation of research for publication, and scrutiny after publication. Undergraduates, graduate students, and career researchers can learn about best practices for transparent, reproducible, and scientifically sound research in the social sciences.

Recent Submissions

  • Session 10: The integration of reproducibility into social science graduate education
    Reif, Julian; Freese, Jeremy; Wasser, David; Connolly, Marie (Labor Dynamics Institute, 2023-06-27)
    Incorporating reproducibility into the graduate curriculum is appealing, since training early-career scholars speeds the adoption of new research practices. However, adding new material to the curriculum comes at a cost: time spent learning the principles and practice of reproducibility is time not spent on other activities. With this in mind, should reproducibility be part of the core curriculum for doctoral students, taught in specialized methods courses, or simply available to those who seek it out? What are best practices for teaching reproducibility to graduate students? If reproducibility topics were included in the graduate curriculum, where might they fit and what might they replace?
  • Session 9: Reproducibility, confidentiality, and open data mandates
    McGrail, Kimberly; Taylor, S. Martin; Lucas, Matthew; Connolly, Marie (Labor Dynamics Institute, 2023-05-30)
    Many granting agencies have adopted open data mandates. What is the interplay between reproducibility and those mandates? How can researchers be supported in meeting those mandates, both in general and specifically when data are confidential? At first glance, confidentiality and open data seem irreconcilable, but could we find practices that both respect confidentiality and provide enough information and transparency to foster reproducibility?
  • Session 8: Should funders require reproducible archives?
    Halbert, Martin; Martinez, Sebastian; Buck, Stuart; Vilhuber, Lars (Labor Dynamics Institute, 2023-04-25)
    Both private and government funders of academic research increasingly require that data collected or created as part of funded research be made openly available. However, this requirement still rarely extends to computational artifacts, such as code, and even more rarely to computational reproducibility itself. The panelists all work for funders and have experience with various funding models and approaches, but only one of them currently enforces computational reproducibility of funded research.
  • Session 7: Why can or should research institutions publish replication packages?
    MacDonald, Graham; Peer, Limor; Butler, Courtney; Michuda, Aleksandr (Labor Dynamics Institute, 2023-03-28)
    This session brings together various perspectives on how research institutions themselves, rather than journals or generalist repositories, approach publishing replication packages. Panelists come from a university with a specialized, university-centred data repository; from a Federal Reserve Bank with an active researcher community; and from a non-profit (non-academic) research institution. Each faces the requirements of varied internal researchers, external visibility, and differing audiences. The panelists can all speak to how a research institution makes decisions about the degree of transparency, and how much of that work to handle with internal resources.
  • Session 6: Institutional support: How do journal reproducibility verification services work?
    Perignon, Christophe; Greiner, Ben; Christian, Thu-Mai; Connolly, Marie (Labor Dynamics Institute, 2023-02-28)
    When journals conduct active verification of replication packages, including accessing data and running code, how does that work? Can journals with limited resources still assess reproducibility? What depth of verification is optimal? Do journals provide a clear indication of whether an article was successfully reproduced?
  • Session 5: Disciplinary support: Why is reproducibility not uniformly required across disciplines?
    Weeden, Kim; Sinclair, Betsy; Hoynes, Hilary; Vilhuber, Lars (Labor Dynamics Institute, 2023-01-31)
    Why do learned societies decide (or not) to implement data (and code) availability policies? What influences the level of enforcement, and the choice of "enforcer" (data editor, administrative staff, referees)? What are reasons NOT to require data sharing or code sharing?
  • Session 4: Reproducibility and confidential or proprietary data: can it be done?
    Horton, John; Guimarães, Paulo; Vilhuber, Lars; Michuda, Aleksandr (Labor Dynamics Institute, 2022-12-13)
    What happens to reproducibility when data are confidential or proprietary? Many journals can only ask that detailed access procedures be provided in a ReadMe file, but what mechanisms could be used to conduct computational reproducibility checks on such data? Should authors temporarily share their data with the journal for the purposes of reproducibility verification, even if they are not part of the public data replication package? Is it feasible to use a network of "insiders" to run code provided as part of a data replication package to assess reproducibility? Could a "certified run" be used?
  • Session 3: Should teaching reproducibility be a part of undergraduate education or curriculum?
    Mendez-Carbajo, Diego; Ball, Richard; Vilhuber, Lars; Schmutte, Ian (Labor Dynamics Institute, 2022-11-20)
    Panelists will discuss teaching reproducibility (TIER Protocol), the involvement of undergraduates for replications based on restricted-access data, and other topics.
  • Session 2: Reproducibility and ethics - IRBs and beyond
    Meyer, Michelle M.; Swauger, Shea; Kopper, Sarah; Vilhuber, Lars (Labor Dynamics Institute, 2022-10-25)
    One of the most crucial dimensions that Institutional Review Boards consider is the protocols researchers have in place to protect their subjects' privacy. This often leads researchers to write in their IRB protocols that they will destroy their data once the project is complete. Understandably, however, destruction of data makes it impossible to verify and replicate work, which is increasingly becoming a vital part of modern science. How should data privacy be handled in the wake of the replication crisis? What protocols and standards should be put in place to minimize the risk of data leakage? Or should data be destroyed after some time span?
  • Session 1: Institutional support: Should journals verify reproducibility?
    Imbens, Guido; Salmon, Tim; Whited, Toni; Vilhuber, Lars (Labor Dynamics Institute, 2022-09-27)
    Different journals take different approaches to enforcing their data availability policies, ranging from a thorough and complete verification, including running code and checking the output; to a cursory review of the files provided to make sure they appear satisfactory; to simply receiving the data and code package and archiving it on a website or in a repository. What drives the choice of approach? What are the reasons behind such choices?