Wei Lu

Dr. Lu brings expertise in machine learning, particularly in integrating human knowledge into machine learning and in explainable machine learning. He has applied machine learning across a range of domains, such as autonomous driving and the optimized design and control of energy storage systems.

Omar Jamil Ahmed

The Ahmed lab studies behavioral neural circuits and attempts to repair them when they go awry in neurological disorders. Working with patients and with transgenic rodent models, we focus on how space, time and speed are encoded by the spatial navigation and memory circuits of the brain. We also focus on how these same circuits go wrong in Alzheimer’s disease, Parkinson’s disease and epilepsy. Our research involves the collection of massive volumes of neural data. Within these terabytes of data, we work to identify and understand irregular activity patterns at the sub-millisecond level. This requires us to leverage high performance computing environments, and to design custom algorithmic and analytical signal processing solutions. As part of our research, we also discover new ways for the brain to encode information (how neurons encode sequences of space and time, for example) – and the algorithms utilized by these natural neural networks can have important implications for the design of more effective artificial neural networks.

Sardar Ansari

I build data science tools to address challenges in medicine and clinical care. Specifically, I apply signal processing, image processing, and machine learning techniques, including deep convolutional and recurrent neural networks and natural language processing, to aid the diagnosis, prognosis, and treatment of patients with acute and chronic conditions. In addition, I conduct research on novel approaches to representing clinical data, combining supervised and unsupervised methods to improve model performance and reduce the labeling burden. Another active area of my research is the design, implementation, and utilization of novel wearable devices for non-invasive patient monitoring in the hospital and at home. This includes integrating the information measured by wearables with the data available in electronic health records, including medical codes, waveforms, and images, among others. A further line of my research applies linear, non-linear, and discrete optimization and queuing theory to build new solutions for healthcare logistics planning, including stochastic approximation methods to model complex systems such as dispatch policies for emergency systems with multi-server dispatches, variable server load, multiple priority levels, and other complications.
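As a small illustration of the queuing-theory side of this work, the classic M/M/c model gives the probability that an arriving request (e.g., an emergency call) finds all servers busy. This sketch uses the standard Erlang C formula with hypothetical arrival and service rates; it is not drawn from any real dispatch system.

```python
import math

def erlang_c(arrival_rate, service_rate, servers):
    """Probability an arriving request must wait (Erlang C formula for M/M/c)."""
    a = arrival_rate / service_rate          # offered load in Erlangs
    rho = a / servers                        # per-server utilization
    assert rho < 1, "system must be stable"
    summation = sum(a**k / math.factorial(k) for k in range(servers))
    tail = a**servers / (math.factorial(servers) * (1 - rho))
    return tail / (summation + tail)

# Hypothetical example: 4 units, 3 calls/hour, 1-hour mean service time.
p_wait = erlang_c(arrival_rate=3.0, service_rate=1.0, servers=4)
mean_wait = p_wait / (4 * 1.0 - 3.0)         # mean wait in queue (hours)
print(round(p_wait, 3), round(mean_wait, 3))
```

Real emergency systems violate the M/M/c assumptions (multi-server dispatches, priority levels), which is precisely why the stochastic approximation methods mentioned above are needed; the closed-form model is just the baseline.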

Mark Steven Cohen

In his various roles, he has helped develop several educational programs in innovation and entrepreneurial development (the only ones of their kind in the world) for medical students, residents, and faculty, and has co-founded four start-up companies (a consulting group, a pharmaceutical company, a device company, and a digital health startup) to improve the care of surgical patients and patients with cancer. He has given over 80 invited talks nationally and internationally and has written and published over 110 original scientific articles and 12 book chapters, as well as the textbook "Success in Academic Surgery: Innovation and Entrepreneurship," published in 2019 by Springer Nature. His research focuses on drug development and nanoparticle drug delivery for cancer therapeutics, evaluation of circulating tumor cells, tissue engineering for the development of thyroid organoids, and the role of mixed reality technologies, AI, and ML in surgical simulation, education, and clinical care delivery; he also directs the Center for Surgical Innovation at Michigan. He has been externally funded for 13 consecutive years by donors and by grants from the Susan G. Komen Foundation and the American Cancer Society, and he currently holds funding from three National Institutes of Health R01 grants through the National Cancer Institute. He has served on several grant study sections for the National Science Foundation, the National Institutes of Health, the Department of Defense, and the Susan G. Komen Foundation. He also serves on several scientific journal editorial boards and has held committee and leadership roles in the Association for Academic Surgery, the Society of University Surgeons, and the American Association of Endocrine Surgeons, where he was the National Program Chair in 2013. For his innovation efforts, he was awarded a Distinguished Faculty Recognition Award by the University of Michigan in 2019.
His clinical interests and national expertise are in endocrine surgery, specifically thyroid surgery for benign and malignant disease, minimally invasive thyroid and parathyroid surgery, and adrenal surgery, as well as advanced melanoma surgery, including developing and running the hyperthermic isolated limb perfusion program for in-transit metastatic melanoma (the only one in the state of Michigan), which is now one of the largest in the nation.

Jesse Hamilton

My research focuses on the development of novel Magnetic Resonance Imaging (MRI) technology for imaging the heart. We focus in particular on quantitative imaging techniques, in which the signal intensity at each pixel in an image represents a measurement of an inherent property of a tissue. Much of our research is based on cardiac Magnetic Resonance Fingerprinting (MRF), which is a class of methods for simultaneously measuring multiple tissue properties from one rapid acquisition.
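At its core, MRF reconstructs tissue properties by comparing each voxel's measured signal evolution against a precomputed dictionary of simulated signal curves and picking the best match. The sketch below illustrates that dictionary-matching step with random stand-in curves and a hypothetical T1 grid; it is a generic illustration of the technique, not the Hamilton lab's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_entries = 200, 500

# Dictionary: each row is the simulated signal evolution ("fingerprint")
# for one candidate tissue-property combination (stand-in random curves here).
dictionary = rng.standard_normal((n_entries, n_timepoints))
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

# Tissue property associated with each entry (hypothetical T1 grid, in ms).
t1_values = np.linspace(100, 2000, n_entries)

# A "measured" voxel signal: entry 123 corrupted by noise.
measured = dictionary[123] + 0.05 * rng.standard_normal(n_timepoints)
measured /= np.linalg.norm(measured)

# Matching: choose the dictionary entry with the largest inner product,
# then report its associated tissue property.
best = int(np.argmax(np.abs(dictionary @ measured)))
print(best, t1_values[best])
```

In practice the dictionary is generated from Bloch-equation simulations over a grid of T1/T2 values, and the matching must run over millions of voxels, which is where accelerated acquisitions and learned reconstructions become important.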

Our group is exploring novel ways to combine physics-based modeling of MRI scans with deep learning algorithms for several purposes. First, we are exploring the use of deep learning to design quantitative MRI scans with improved accuracy and precision. Second, we are developing deep learning approaches for image reconstruction that will allow us to reduce image noise, improve spatial resolution and volumetric coverage, and enable highly accelerated acquisitions to shorten scan times. Third, we are exploring ways of using artificial intelligence to derive physiological motion signals directly from MRI data to enable continuous scanning that is robust to cardiac and breathing motion. In general, we focus on algorithms that are either self-supervised or use training data generated in computer simulations, since the collection of large amounts of training data from human subjects is often impractical when designing novel imaging methods.

Kathryn Luker

As an expert in molecular imaging of single-cell signaling in cancer, I develop integrated systems of molecular, cellular, optical, and custom image processing tools to extract rich data sets for biochemical and behavioral functions in living cells over minutes to days. Data sets composed of thousands to millions of cells enable us to develop predictive models of cellular function through a variety of computational approaches, including ordinary differential equation (ODE), agent-based (ABM), and inverse reinforcement learning (IRL) modeling.
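As a minimal sketch of the ODE-modeling approach, consider a single signaling species that is activated in proportion to ligand and unoccupied capacity and decays at a constant rate, integrated with forward Euler. The rate constants and the model itself are hypothetical illustrations, not a model from this lab.

```python
def simulate(k_on=0.5, k_off=0.2, ligand=1.0, x0=0.0, dt=0.01, t_end=30.0):
    """Integrate dx/dt = k_on * ligand * (1 - x) - k_off * x by forward Euler.

    x is the activated fraction of the species, so it stays in [0, 1].
    """
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (k_on * ligand * (1.0 - x) - k_off * x)
    return x

# The steady state is k_on*L / (k_on*L + k_off) = 0.5 / 0.7 ~ 0.714;
# after 30 time units the trajectory has converged to it.
print(round(simulate(), 3))
```

Fitting models of this kind to time-lapse trajectories from thousands of individual cells is what turns the imaging data into predictive, mechanistic descriptions of signaling.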

Xianglei Huang

Prof. Huang specializes in satellite remote sensing, atmospheric radiation, and climate modeling. His research makes extensive use of optimization, pattern analysis, and dimensionality reduction to explain observed spectrally resolved infrared spectra, estimate geophysical parameters from such hyperspectral observations, and deduce human influence on the climate in the presence of the climate system's natural variability. His group has also developed a deep-learning model for data-driven solar forecasting for use in the renewable energy sector.
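Dimensionality reduction of hyperspectral radiances is commonly done with principal component analysis, which compresses hundreds of correlated channels into a few leading modes. The sketch below builds synthetic spectra from three underlying modes plus noise and recovers a three-number representation per spectrum; the channel counts and data are hypothetical stand-ins for real satellite observations.

```python
import numpy as np

rng = np.random.default_rng(42)
n_spectra, n_channels = 1000, 300

# Synthetic spectra: a few underlying physical modes plus small noise.
modes = rng.standard_normal((3, n_channels))
weights = rng.standard_normal((n_spectra, 3))
spectra = weights @ modes + 0.01 * rng.standard_normal((n_spectra, n_channels))

# Center the data, then take leading principal components via the SVD.
mean = spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(spectra - mean, full_matrices=False)
scores = (spectra - mean) @ Vt[:3].T   # each spectrum reduced to 3 numbers

# Fraction of variance captured by the first 3 components (near 1 here).
explained = (S[:3] ** 2).sum() / (S ** 2).sum()
print(scores.shape, round(explained, 3))
```

The same compression idea underlies retrievals from hyperspectral sounders, where a handful of leading components carry nearly all of the geophysical signal in thousands of channels.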

Joyce Penner

I am new to research on artificial intelligence applied to the atmospheric sciences. My previous experience is in comparing satellite data products with 3-D global simulations.

Sindhu Kutty

My research centers on the interaction between abstract, theoretically sound probabilistic algorithms and human beings. One aspect of my research explores connections between Machine Learning and both Crowdsourcing and Economics, focusing in each case on better understanding the aggregation process. As Machine Learning algorithms are used to make decisions that affect human lives, I am interested in evaluating the fairness of Machine Learning algorithms and in exploring various paradigms of fairness; I study how these notions interact with more traditional performance metrics. My research in Computer Science Education focuses on developing and using evidence-based techniques for teaching undergraduates Machine Learning. To this end, I have developed a pilot summer program that introduces students to current Machine Learning research and enables them to make a more informed decision about what role they would like research to play in their future. I have also mentored (and continue to mentor) undergraduate students, working with them to produce publishable, award-winning undergraduate research.

Mithun Chakraborty

My broad research interests are in multi-agent systems, computational economics and finance, and artificial intelligence. I apply techniques from algorithmic game theory, statistical machine learning, and decision theory to a variety of problems at the intersection of the computational and social sciences. A major focus of my research has been the design and analysis of market-making algorithms for financial markets and, in particular, prediction markets: incentive-based mechanisms for aggregating data, in the form of private beliefs about uncertain events (e.g., the outcome of an election), distributed among strategic agents. I use both analytical and simulation-based methods to investigate the impact of factors such as wealth, risk attitude, and manipulative behavior on information aggregation in market ecosystems. Another line of work I am pursuing involves algorithms for allocating resources based on preference data collected from potential recipients while satisfying efficiency, fairness, and diversity criteria; my joint work on ethnicity quotas in Singapore public housing allocation deserves special mention in this vein. More recently, I have become involved in research on empirical game-theoretic analysis, a family of methods for building tractable models of complex, procedurally defined games from empirical or simulated payoff data and using them to reason about game outcomes.
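A standard prediction-market market maker of the kind studied in this area is Hanson's logarithmic market scoring rule (LMSR): traders buy outcome shares from an automated market maker whose prices form a probability distribution that shifts with each trade. This sketch shows the LMSR cost function and prices for a hypothetical two-outcome market; it illustrates the general mechanism, not a specific algorithm from this research.

```python
import numpy as np

def lmsr_cost(q, b=100.0):
    """LMSR cost function C(q) = b * log(sum_i exp(q_i / b)).

    q[i] is the number of outstanding shares of outcome i; b controls
    market liquidity (larger b means prices move less per trade).
    """
    return b * np.log(np.sum(np.exp(np.asarray(q) / b)))

def lmsr_prices(q, b=100.0):
    """Instantaneous prices: softmax of q/b. They sum to 1, so they can
    be read as the market's aggregate probability estimate."""
    z = np.exp(np.asarray(q) / b)
    return z / z.sum()

# Two-outcome market with no shares sold yet: both prices are equal.
q = np.array([0.0, 0.0])
print(lmsr_prices(q))

# A trader buys 50 shares of outcome 0 and pays the cost difference;
# the price of outcome 0 rises, reflecting the trader's information.
q_new = q + np.array([50.0, 0.0])
payment = lmsr_cost(q_new) - lmsr_cost(q)
print(round(payment, 2), lmsr_prices(q_new))
```

The market maker's worst-case loss is bounded by b * log(n) for n outcomes, which is the property that makes LMSR a common baseline for studying manipulation, risk attitudes, and aggregation quality.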