MIDAS Reproducibility Challenge Showcase: John Silberholz & Nigel Michki

July 21, 2020, 1:00–2:00 PM

View Event Recording

What Works Best When? A Systematic Evaluation of Heuristics for Max-Cut and QUBO

Iain Dunning (Hudson River Trading), Swati Gupta (Assistant Professor, Industrial and Systems Engineering, Georgia Tech), John Silberholz (Assistant Professor, Ross School of Business, University of Michigan)

View John Silberholz Recording

A Multi-informatic Cellular Visualization tool for interactively interrogating high-dimensional datasets

Nigel Michki (Doctoral Student – Biophysics, University of Michigan)

View Nigel Michki Recording

What Works Best When? A Systematic Evaluation of Heuristics for Max-Cut and QUBO: Though empirical testing is broadly used to evaluate heuristics, there are shortcomings in how it is often applied in practice. In a systematic review of Max-Cut and Quadratic Unconstrained Binary Optimization (QUBO) heuristics papers, we found that only 4% publish source code, only 14% compare heuristics under identical termination criteria, and most experiments are performed on artificial, homogeneous sets of problem instances. To address these limitations, we implement and release as open source a codebase of 10 Max-Cut and 27 QUBO heuristics, which we evaluate using cloud computing on a diverse library of 3,296 instances.
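For context, the two problems have standard formulations: Max-Cut seeks a partition of a weighted graph's vertices that maximizes the total weight of the edges crossing the cut, while QUBO optimizes a quadratic form over binary variables. In the usual notation:

    % Max-Cut on a weighted graph G = (V, E), encoding the partition
    % by x_i = +1 or -1; edge (i, j) crosses the cut iff x_i != x_j.
    \max_{x \in \{-1,+1\}^{|V|}} \; \frac{1}{2} \sum_{(i,j) \in E} w_{ij} \, (1 - x_i x_j)

    % QUBO over n binary variables with coefficient matrix Q.
    \max_{x \in \{0,1\}^n} \; x^\top Q \, x

The two problems are closely related: Max-Cut rewrites directly as a QUBO by substituting x_i = 2y_i - 1 with y_i in {0,1} (linear terms fold into the diagonal of Q, since y_i^2 = y_i), and a QUBO reduces to Max-Cut with one auxiliary vertex, which is why heuristics for the two problems are naturally studied together.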

A Multi-informatic Cellular Visualization tool for interactively interrogating high-dimensional datasets: Biologists looking to scRNA-seq as a high-throughput exploratory method are often bogged down by the multitude of bioinformatics tools and pipelines available today. This is unfortunate, because many tools have matured beyond the shortcomings of their predecessors and can yield informative results without elaborate parameter tuning. MiCV, the Multi-informatic Cellular Visualization tool that we have built in this work, is our answer to the questions: “Which pipeline should I be using? How long will it take me to learn how to use it? And how can I communicate with my colleagues or reviewers if they do not know the same tools?” By providing a uniform web interface to a set of essential analytical tools, we aim to accelerate the pace at which biologists can make novel discoveries in their own datasets, validate other researchers’ findings, and dig deeper into the many massive scRNA-seq datasets being rapidly generated around the globe.
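The abstract does not spell out MiCV's internal pipeline, but the analytical stages such a tool wraps are typical of modern scRNA-seq workflows. As an illustrative sketch only, using the scanpy library and the public pbmc3k dataset (both assumptions for illustration, not a description of MiCV's internals, and with thresholds chosen arbitrarily), a minimal end-to-end analysis might look like:

    # Minimal scRNA-seq analysis sketch (illustrative; not MiCV's code).
    import scanpy as sc

    # Load a small public dataset (3k peripheral blood mononuclear cells).
    adata = sc.datasets.pbmc3k()

    # Basic quality-control filtering; thresholds are illustrative.
    sc.pp.filter_cells(adata, min_genes=200)
    sc.pp.filter_genes(adata, min_cells=3)

    # Normalize counts per cell, then log-transform.
    sc.pp.normalize_total(adata, target_sum=1e4)
    sc.pp.log1p(adata)

    # Restrict to highly variable genes and reduce dimensionality.
    sc.pp.highly_variable_genes(adata, n_top_genes=2000, subset=True)
    sc.pp.pca(adata, n_comps=50)

    # Build a neighborhood graph, embed with UMAP, and cluster
    # (clustering requires the leidenalg package).
    sc.pp.neighbors(adata, n_neighbors=15)
    sc.tl.umap(adata)
    sc.tl.leiden(adata)

    # Visualize clusters in the UMAP embedding.
    sc.pl.umap(adata, color="leiden")

In a web tool like MiCV, each of these stages would be exposed as an interactive step rather than a script, so a collaborator or reviewer can inspect and adjust the analysis without knowing the underlying library.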

—————————————————————

The Reproducibility Showcase features a series of online presentations and tutorials from May to August 2020.  Presenters are selected from the MIDAS Reproducibility Challenge 2020.

A significant challenge across scientific fields is the reproducibility of research results, and third-party assessment of such reproducibility. The goal of the MIDAS Reproducibility Challenge is to highlight high-quality, reproducible work at the University of Michigan by collecting examples of best practices across diverse fields.  We received a large number of entries that illustrate wonderful work in the following areas: 

  1. Theory – A definition of reproducibility and what aspects of reproducibility are critical in a particular domain or in general.
  2. Reproducing a Particular Study – Comprehensive record of parameters and code that allows for others to reproduce the results in a particular project.
  3. Generalizable Tools – A general platform for coding or running analyses that standardizes the methods for reproducible results across studies.
  4. Robustness – Metadata, tools and processes to improve the robustness of results to variations in data, computational hardware and software, and human decisions.
  5. Assessments of Reproducibility – Methods to test the consistency of results from multiple projects, such as meta-analysis or the provision of parameters that can be compared across studies.
  6. Reproducibility under Constraints – Sharing code and/or data to reproduce results without violating privacy or other restrictions.

On Sept. 14, 2020, MIDAS will also host a Reproducibility Day, a workshop on the concepts and best practices of research reproducibility.  Please save the date.