Michael is an Assistant Professor of Energy Systems at the University of Michigan’s School for Environment and Sustainability and PI of the ASSET Lab. He researches how to equitably reduce the global and local environmental impacts of energy systems while making those systems robust to future climate change. His research advances energy system models to address new challenges driven by decarbonization, climate adaptation, and equity objectives. He then applies these models to real-world systems to generate decision-relevant insights that account for engineering, economic, climatic, and policy features. His energy system models leverage optimization and simulation methods, depending on the problem at hand. Applying these models to climate mitigation or adaptation in real-world systems often runs into computational limits, which he overcomes through clustering, sampling, and other data reduction algorithms. His current interdisciplinary collaborators include climate scientists, hydrologists, economists, urban planners, epidemiologists, and engineers from many disciplines.
My research focuses on the application and development of new algorithms for solving complex business analytics problems. Applications range from revenue management, dynamic pricing, and marketing analytics to retail logistics. In terms of methodology, I use a combination of operations research and machine learning/online optimization techniques.
Dr. Kochunas’s research focuses on the next generation of numerical methods and parallel algorithms for high-fidelity computational reactor physics and on how to leverage these capabilities to develop digital twins. His group’s areas of expertise include neutron transport, nuclide transmutation, multi-physics, parallel programming, and HPC architectures. The Nuclear Reactor Analysis and Methods (NURAM) group is also developing techniques that integrate data-driven methods with conventional approaches in numerical analysis to produce “hybrid models” for accurate, real-time modeling applications. This is embodied by his recent efforts to combine high-fidelity simulation results with simulation models in virtual reality through the Virtual Ford Nuclear Reactor.
My lab researches how the human brain processes social and affective information and how these processes are affected in psychiatric disorders, especially schizophrenia and bipolar disorder. We use behavioral, electrophysiological (EEG), neuroimaging (functional MRI), eye tracking, brain stimulation (TMS, tACS), and computational methods in our studies. One main focus of our work is building and validating computational models based on intensive, high-dimensional subject-level behavioral and brain data to explain clinical phenomena, parse mechanisms, and predict patient outcomes. The goal is to improve diagnostic and prognostic assessment and to develop personalized treatments.
My research focuses on understanding the social cognitive, affective, and biological factors that shape our closest relationships. I am particularly interested in identifying factors that help romantic couples and families maintain high quality relationships. My work draws upon a variety of methods, including experimental, observational, naturalistic (e.g., daily experience), and physiological, to capture people at multiple levels in a variety of social situations. I frequently gather dyadic longitudinal data in order to understand how relationship partners influence each other in the moment and over time.
Uncertainty quantification and decision making are in growing demand as technologies in engineering and transportation systems advance. Within uncertainty quantification, Dr. Wenbo Sun is particularly interested in statistical modeling of engineering system responses that accounts for high dimensionality and complicated correlation structures, as well as in simultaneously quantifying uncertainty from a variety of sources, such as the inexactness of large-scale computer experiments, process variation, and measurement noise. He is also interested in data-driven decision making that is robust to such uncertainty. Specifically, he delivers methodologies for anomaly detection and system design optimization, which can be applied to manufacturing process monitoring, distracted driving detection, out-of-distribution object identification, vehicle safety design optimization, and more.
My research focuses on building infrastructure for public health and health science research organizations to take advantage of cloud computing, strong software engineering practices, and MLOps (machine learning operations). By equipping biomedical research groups with tools that facilitate automation, better documentation, and portable code, we can improve the reproducibility and rigor of science while scaling up the kind of data collection and analysis possible.
Research topics include:
1. Open source software and cloud infrastructure for research,
2. Software development practices and conventions that work for academic units, like labs or research centers, and
3. The organizational factors that encourage best practices in reproducibility, data management, and transparency
The practice of science is a tug of war between competing incentives: the drive to do a lot fast, and the need to generate reproducible work. As data grow in size, code increases in complexity, and the number of collaborators and institutions involved goes up, it becomes harder to preserve all the “artifacts” needed to understand and recreate your own work. Both technical and cultural solutions will be needed to keep data-centric research rigorous, shareable, and transparent to the broader scientific community.
As an environmental epidemiologist working in collaboration with government and community partners, I study how social, economic, health, and built environment characteristics, as well as air quality, affect vulnerability to extreme heat and extreme precipitation. This research will help cities understand how to adapt to heat, heat waves, higher pollen levels, and heavy rainfall in a changing climate.
The Ahmed lab studies behavioral neural circuits and attempts to repair them when they go awry in neurological disorders. Working with patients and with transgenic rodent models, we focus on how space, time and speed are encoded by the spatial navigation and memory circuits of the brain. We also focus on how these same circuits go wrong in Alzheimer’s disease, Parkinson’s disease and epilepsy. Our research involves the collection of massive volumes of neural data. Within these terabytes of data, we work to identify and understand irregular activity patterns at the sub-millisecond level. This requires us to leverage high performance computing environments, and to design custom algorithmic and analytical signal processing solutions. As part of our research, we also discover new ways for the brain to encode information (how neurons encode sequences of space and time, for example) – and the algorithms utilized by these natural neural networks can have important implications for the design of more effective artificial neural networks.
I build data science tools to address challenges in medicine and clinical care. Specifically, I apply signal processing, image processing, and machine learning techniques, including deep convolutional and recurrent neural networks and natural language processing, to aid the diagnosis, prognosis, and treatment of patients with acute and chronic conditions. In addition, I conduct research on novel approaches to representing clinical data, combining supervised and unsupervised methods to improve model performance and reduce the labeling burden. Another active area of my research is the design, implementation, and utilization of novel wearable devices for non-invasive patient monitoring in the hospital and at home. This includes integrating the information measured by wearables with the data available in electronic health records, including medical codes, waveforms, and images, among others. A further area of my research applies linear, non-linear, and discrete optimization and queuing theory to build new solutions for healthcare logistics planning, including stochastic approximation methods for modeling complex systems such as dispatch policies for emergency systems with multi-server dispatches, variable server load, and multiple priority levels.