The Reproducibility Showcase features a series of online presentations and tutorials from May to August 2020. Presenters are selected from the MIDAS Reproducibility Challenge 2020.

A significant challenge across scientific fields is the reproducibility of research results and the third-party assessment of that reproducibility. The goal of the MIDAS Reproducibility Challenge is to highlight high-quality, reproducible work at the University of Michigan by collecting examples of best practices across diverse fields. We received a large number of entries that illustrate wonderful work in the following areas:

    1. Theory – A definition of reproducibility and what aspects of reproducibility are critical in a particular domain or in general.
    2. Reproducing a Particular Study – A comprehensive record of parameters and code that allows others to reproduce the results of a particular project (see the brief sketch after this list).
    3. Generalizable Tools – A general platform for coding or running analyses that standardizes the methods for reproducible results across studies.
    4. Robustness – Metadata, tools and processes to improve the robustness of results to variations in data, computational hardware and software, and human decisions.
    5. Assessments of Reproducibility – Methods to test the consistency of results from multiple projects, such as meta-analysis or the provision of parameters that can be compared across studies.
    6. Reproducibility under Constraints – Sharing code and/or data to reproduce results without violating privacy or other restrictions.
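
To make the second category concrete, here is a minimal, hypothetical sketch (in Python) of the kind of record-keeping it describes: saving the analysis parameters, the random seed, and the software environment alongside the results so that someone else can rerun the analysis. The file name, function, and parameter values are illustrative only and are not drawn from any of the showcased projects.

    # Hypothetical sketch: record parameters, seed, and environment with the results.
    import json
    import platform
    import random
    import sys

    def run_analysis(params):
        """Toy analysis: a reproducible sample mean from a seeded generator."""
        random.seed(params["seed"])
        draws = [random.gauss(params["mu"], params["sigma"]) for _ in range(params["n"])]
        return {"sample_mean": sum(draws) / len(draws)}

    if __name__ == "__main__":
        params = {"seed": 20200515, "mu": 0.0, "sigma": 1.0, "n": 1000}
        results = run_analysis(params)

        # Keep everything needed to rerun the analysis in a single record.
        record = {
            "parameters": params,
            "results": results,
            "environment": {
                "python_version": sys.version,
                "platform": platform.platform(),
            },
        }
        with open("analysis_record.json", "w") as fh:
            json.dump(record, fh, indent=2)

Keeping the parameters and environment in the same record as the results is the kind of small, everyday practice these categories point to.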

This showcase will lead up to our Reproducibility Day on Sept. 14, 2020.

May 15, 1 – 2pm: Everyday Reproducibility: A multi-pronged approach to ensure analyses are fully reproducible, easy to access, and easy to use, Johann Gagnon-Bartsch, Assistant Professor, Statistics. (View Recording)

June 9, 1 – 2pm: American Economic Association (AEA) Data and Code Repository at openICPSR, Jared Lyle (View Jared Lyle Recording), Archivist, ICPSR, & Data-specific functions, Jacob Fisher (View Jacob Fisher Recording), Research Investigator, Survey Research Center. (View Recording)

June 23, 2 – 3pm: The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability and data reproducibility, Oliver He (View Recording, Oliver He), Associate Professor, Microbiology & Immunology, & C2Metadata: Continuous Capture of Metadata for Statistical Data, Jie Song (View Recording, Jie Song), Graduate Student Research Assistant, ICPSR. (View Full Recording)

July 7, 1 – 2pm: Quantify and Control Reproducibility in High-throughput Experiments, Xiaoquan (William) Wen, Associate Professor, Biostatistics, & Yi Zhao, University of Michigan alum. (View Recording)

July 21, 1 – 2pm: What Works Best When? A Systematic Evaluation of Heuristics for Max-Cut and QUBO, John Silberholz (View John Silberholz Recording), Assistant Professor, Ross School of Business, University of Michigan, together with collaborators Iain Dunning and Swati Gupta; A Multi-informatic Cellular Visualization tool for interactively interrogating high-dimensional datasets, Nigel Michki (View Nigel Michki Recording), Doctoral Student, Biophysics, University of Michigan. (View Full Recording)

July 28, 1 – 2pm: An Open Software Approach for Reproducible Research for Materials Design, Sharon Glotzer, Professor, Chemical Engineering and Materials Science & Engineering. (View Recording)

August 4, 1 – 2pm: Replicate.Education: Lessons Learned Building a Platform for Education Data Science Replications, Chris Brooks, Assistant Professor of Information, School of Information. (View Recording)

August 25, 1 – 2pm: Establishing systematic practices to share statistical code, Thomas Valley, Assistant Professor, Division of Pulmonary and Critical Care Medicine, Department of Internal Medicine. (View Recording)