MIDAS Reproducibility Challenge Showcase: Xiaoquan (William) Wen & Yi Zhao – Quantify and Control Reproducibility in High-throughput Experiments

July 7, 2020 1:00 PM - 2:00 PM

View Event Recording

Xiaoquan (William) Wen* & Yi Zhao**

*Associate Professor, Biostatistics, University of Michigan; **University of Michigan Alum, Biostatistics

Quantify and Control Reproducibility in High-throughput Experiments

Abstract: Despite the critical importance of reproducible research, computational tools for quantifying and controlling reproducibility in high-throughput biological experiments are still lacking. In this talk, we first discuss an intuitive definition of reproducibility based on directional consistency (DC) when experimental units are assessed with signed effect size estimates. Based on this definition, we propose a Bayesian hierarchical model framework and a set of computational tools to i) assess the overall reproducibility of multiple studies through parameter estimation; and ii) evaluate reproducibility at the level of individual experimental units. We will demonstrate the use of the proposed methods in detecting unobserved batch effects in high-throughput experiments. We will further illustrate our proposed approaches by analyzing data from transcriptome-wide association studies (TWAS). Specifically, we evaluate the reproducibility of two large-scale GWAS height datasets, from the UK Biobank and the GIANT consortium, in a TWAS analysis. Applying the same methodology, we investigate the tissue specificity of the TWAS signals in whole blood and skeletal muscle based on the eQTL data from the GTEx project. If time permits, we will discuss extensions of the proposed reproducibility measures and their potential applications in other vital areas of reproducible research (e.g., publication bias and conceptual replications).
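The talk's measure is model-based and Bayesian, but the underlying idea of directional consistency can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration (not the authors' implementation): it takes signed effect size estimates (here, made-up z-scores) for the same experimental units from two studies and reports the fraction whose directions agree.

```python
import numpy as np

def directional_consistency(effects_a, effects_b):
    """Fraction of experimental units whose signed effect estimates
    agree in direction between two studies (a naive stand-in for the
    talk's model-based DC measure)."""
    effects_a = np.asarray(effects_a, dtype=float)
    effects_b = np.asarray(effects_b, dtype=float)
    # Restrict to units with a nonzero estimate in both studies,
    # since a zero estimate carries no direction.
    mask = (effects_a != 0) & (effects_b != 0)
    agree = np.sign(effects_a[mask]) == np.sign(effects_b[mask])
    return agree.mean()

# Hypothetical z-scores for five genes from two TWAS analyses.
study_a = [2.1, -1.4, 0.3, -2.8, 1.7]
study_b = [1.8, -0.9, -0.5, -3.1, 2.2]
print(f"DC estimate: {directional_consistency(study_a, study_b):.2f}")  # 0.80
```

In the framework described in the abstract, this raw sign-agreement proportion is replaced by inference under a Bayesian hierarchical model, so that directional agreement is assessed probabilistically, accounting for estimation noise in each study rather than treating the observed signs as exact.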

The Reproducibility Showcase features a series of online presentations and tutorials from May to August 2020. Presenters are selected from the MIDAS Reproducibility Challenge 2020.

A significant challenge across scientific fields is the reproducibility of research results and the third-party assessment of such reproducibility. The goal of the MIDAS Reproducibility Challenge is to highlight high-quality, reproducible work at the University of Michigan by collecting examples of best practices across diverse fields. We received a large number of entries that illustrate wonderful work in the following areas:

  1. Theory – A definition of reproducibility and what aspects of reproducibility are critical in a particular domain or in general.
  2. Reproducing a Particular Study – Comprehensive record of parameters and code that allows others to reproduce the results of a particular project.
  3. Generalizable Tools – A general platform for coding or running analyses that standardizes the methods for reproducible results across studies.
  4. Robustness – Metadata, tools, and processes to improve the robustness of results to variations in data, computational hardware and software, and human decisions.
  5. Assessments of Reproducibility – Methods to test the consistency of results from multiple projects, such as meta-analysis or the provision of parameters that can be compared across studies.
  6. Reproducibility under Constraints – Sharing code and/or data to reproduce results without violating privacy or other restrictions.

On Sept. 14, 2020, MIDAS will also host a Reproducibility Day, a workshop on the concepts and best practices of research reproducibility. Please save the date.