I am Research Faculty with the Michigan Center for Integrative Research in Critical Care (MCIRCC). Our team builds predictive algorithms, analyzes signals, and implements statistical models to advance critical care medicine. We use electronic health record data to build predictive algorithms; one example is Predicting Intensive Care Transfers and other Unforeseen Events (PICTURE), which uses commonly collected vital signs and labs to predict patient deterioration on the general hospital floor. Our team also collects waveforms from the University Hospital, stores this data with Amazon Web Services, and uses these signals to build predictive algorithms that advance precision medicine. Our flagship algorithm, the Analytic for Hemodynamic Instability (AHI), predicts patient deterioration using a single-lead electrocardiogram signal. We use Bayesian methods to analyze metabolomic biomarker data from blood and exhaled breath to understand sepsis and acute respiratory distress syndrome. I also have an interest in statistical genetics.
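As a rough illustration of the kind of risk model PICTURE represents (not its actual algorithm or features), the sketch below fits a logistic regression to synthetic vital-sign and lab data; all variable names and coefficients are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for EHR features (hypothetical): heart rate, respiratory
# rate, and lactate for 500 patients; deterioration risk rises with all three.
n = 500
X = np.column_stack([
    rng.normal(80, 15, n),    # heart rate (bpm)
    rng.normal(18, 4, n),     # respiratory rate (breaths/min)
    rng.normal(1.5, 0.8, n),  # lactate (mmol/L)
])
logits = 0.04 * (X[:, 0] - 80) + 0.15 * (X[:, 1] - 18) + 1.2 * (X[:, 2] - 1.5) - 2.0
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

# Standardize features, then fit logistic regression by gradient descent.
Xs = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(Xs @ w + b)))
    w -= 0.5 * (Xs.T @ (p - y) / n)
    b -= 0.5 * (p - y).mean()

risk = 1 / (1 + np.exp(-(Xs @ w + b)))  # predicted deterioration probability
```

A deployed early-warning score would of course use far richer features, careful validation, and calibration; this only shows the basic shape of the prediction task.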
Jeffrey Regier received a PhD in statistics from UC Berkeley (2016) and joined the University of Michigan as an assistant professor. His research interests include graphical models, Bayesian inference, high-performance computing, deep learning, astronomy, and genomics.
Efficient, low-regret contextual multi-armed bandit approaches for real-time learning, including Thompson sampling, UCB, and the knowledge gradient. Integration of optimization and predictive analytics for determining the time to the next measurement, which modality to use, and the optimal control of risk factors to manage chronic disease. Integration of soft-voting ensemble classifiers and multiple-model Kalman filters for disease state prediction. Real-time (online) contextual multi-armed bandits integrated with optimization of hospital bed-type dynamic control decisions for reducing 30-day hospital readmission rates. Robustness in system optimization when the system model is uncertain, with emphasis on quantile regression forests, sample average approximation, robust optimization, and distributionally robust optimization. Health care delivery system models with prediction and control for inpatient and outpatient care. Work has been done on emergency department redesign for improved patient flow; capacity management, planning, and scheduling for outpatient care, including integrated services networks; admission control with machine learning for ICUs, stepdown, and regular care units; surgical planning and scheduling for access delay control; and planning and scheduling for clinical research units.
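To make the contextual-bandit theme concrete, here is a minimal linear Thompson sampling sketch on a synthetic problem (the setup, dimensions, and reward model are all hypothetical, not taken from this research): each arm keeps a Gaussian posterior over its weight vector, samples from it, and plays the arm whose sample scores the context highest.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: K arms, d-dimensional contexts, linear rewards.
K, d, T = 3, 4, 2000
true_theta = rng.normal(size=(K, d))          # unknown arm parameters

A = np.stack([np.eye(d) for _ in range(K)])   # posterior precision per arm
b = np.zeros((K, d))                          # posterior moment vector per arm

rewards = []
for t in range(T):
    x = rng.normal(size=d)                    # observe a context
    # Sample a weight vector from each arm's posterior; play the best sample.
    sampled = []
    for k in range(K):
        cov = np.linalg.inv(A[k])
        sampled.append(rng.multivariate_normal(cov @ b[k], cov) @ x)
    k = int(np.argmax(sampled))
    r = true_theta[k] @ x + rng.normal(scale=0.1)
    # Bayesian linear-regression update for the chosen arm only.
    A[k] += np.outer(x, x)
    b[k] += r * x
    rewards.append(r)

avg_late = np.mean(rewards[-500:])            # improves as posteriors sharpen
```

The same skeleton extends to the decisions described above (e.g., which measurement modality to use next) by redefining the arms and contexts.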
The future of transportation lies at the intersection of two emerging trends, namely, the sharing economy and connected and automated vehicle technology. Our research group investigates the impact of these two major trends on the future of mobility, quantifying the benefits and identifying the challenges of integrating these technologies into our current systems.
Our research on shared-use mobility systems focuses on peer-to-peer (P2P) ridesharing and multi-modal transportation. We provide: (i) operational tools and decision support systems for shared-use mobility in legacy as well as connected and automated transportation systems. This line of research focuses on system design as well as routing, scheduling, and pricing mechanisms to serve on-demand transportation requests; (ii) insights for regulators and policy makers on the mobility benefits of multi-modal transportation; and (iii) planning tools that would allow for informed regulation of the sharing economy.
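As a toy illustration of the matching decisions at the core of on-demand ridesharing (a sketch under assumed data, not a tool from this group), the snippet below assigns drivers to riders so that total pickup distance is minimized, using the Hungarian algorithm:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(6)

# Hypothetical P2P ridesharing instance: 4 riders and 4 drivers at random
# locations on a 10 x 10 grid.
riders = rng.uniform(0, 10, size=(4, 2))   # rider locations (x, y)
drivers = rng.uniform(0, 10, size=(4, 2))  # driver locations (x, y)

# Cost matrix: Euclidean pickup distance from each driver to each rider.
cost = np.linalg.norm(drivers[:, None, :] - riders[None, :, :], axis=2)

# Minimum-cost perfect matching (Hungarian algorithm).
drv_idx, rid_idx = linear_sum_assignment(cost)
total_pickup = cost[drv_idx, rid_idx].sum()
```

Real systems layer time windows, detours, pricing, and dynamics on top of this, but a cost-minimizing assignment is the basic building block.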
In another line of research, we investigate challenges that connected and automated vehicle technology must overcome before mass adoption can occur. Our research mainly focuses on (i) the transition of control authority between the human driver and the autonomous entity in semi-autonomous (SAE Level 3) vehicles; (ii) incorporating network-level information supplied by connected vehicle technology into traditional trajectory planning; (iii) improving vehicle localization by taking advantage of opportunities provided by connected vehicles; and (iv) cybersecurity challenges in connected and automated systems. We seek to quantify the mobility and safety implications of this disruptive technology and provide insights that can allow for informed regulation.
My research broadly focuses on developing data analytics and decision-making methodologies specifically tailored for Internet of Things (IoT) enabled smart and connected products and systems. I envision that most (if not all) engineering systems will eventually become connected systems. Therefore, my key focus is on developing next-generation data analytics, machine learning, individualized informatics, and graphical and network modeling tools to truly realize the competitive advantages that are promised by smart and connected products and systems.
Professor Seiford’s research interests are primarily in the areas of quality engineering, productivity analysis, process improvement, multiple-criteria decision making, and performance measurement. In addition, he is recognized as one of the world’s experts in the methodology of Data Envelopment Analysis. His current research involves the development of benchmarking models for identifying best practices in manufacturing and service systems. He has written and co-authored four books and over one hundred articles in the areas of quality, productivity, operations management, process improvement, decision analysis, and decision support systems.
My research focuses on the development and application of machine learning tools to large-scale financial and unstructured (textual) data to extract, quantify, and predict risk profiles and investment-grade ratings of private and public companies. Example datasets include social media and financial aggregators such as Bloomberg, Pitchbook, and Privco.
Matthew Kay, PhD, is Assistant Professor of Information, School of Information and Assistant Professor of Electrical Engineering and Computer Science, College of Engineering, at the University of Michigan, Ann Arbor.
Prof. Kay’s research includes work on communicating uncertainty, usable statistics, and personal informatics. People are increasingly exposed to sensing and prediction in their daily lives (“how many steps did I take today?”, “how long until my bus shows up?”, “how much do I weigh?”). Uncertainty is both inherent to these systems and usually poorly communicated. To build understandable data presentations, we must study how people interpret their data and what goals they have for it, which informs the way that we should communicate results from our models, which in turn determines what models we must use in the first place. Prof. Kay tackles these problems using a multi-faceted approach, including qualitative and quantitative analysis of behavior, building and evaluating interactive systems, and designing and testing visualization techniques. His work draws on approaches from human-computer interaction, information visualization, and statistics to build information visualizations that people can more easily understand along with the models to back those visualizations.
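One concrete approach to communicating the uncertainty in such predictions is a quantile dotplot, which discretizes a predictive distribution into a countable set of equally probable dots. The sketch below (a minimal illustration with an assumed, synthetic predictive distribution for bus wait times) computes the 20 dot positions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical predictive distribution for "minutes until my bus arrives",
# represented by posterior samples.
samples = rng.lognormal(mean=np.log(8), sigma=0.4, size=10_000)

# A 20-dot quantile dotplot: one dot per 5% of probability mass, so a rider
# can simply count dots to estimate chances (e.g., "most dots fall past
# 5 minutes, so I very likely have time to walk to the stop").
n_dots = 20
probs = (np.arange(n_dots) + 0.5) / n_dots   # midpoints of 20 equal bins
dots = np.quantile(samples, probs)           # one arrival time per dot

chance_after_5 = np.mean(dots > 5)           # fraction of dots past 5 minutes
```

The dot positions would then be rendered as a stacked dotplot; the key design idea is that frequency framing (counting dots) is easier for people than reading density curves.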
Mingyan Liu, PhD, is Professor of Electrical Engineering and Computer Science, College of Engineering, at the University of Michigan, Ann Arbor.
Prof. Liu’s research interest lies in optimal resource allocation, sequential decision theory, online and machine learning, performance modeling, analysis, and design of large-scale, decentralized, stochastic and networked systems, using tools including stochastic control, optimization, game theory and mechanism design. Her most recent research activities involve sequential learning, modeling and mining of large scale Internet measurement data concerning cyber security, and incentive mechanisms for inter-dependent security games. Within this context, her research group is actively working on the following directions.
1. Cyber security incident forecast. The goal is to predict an organization’s likelihood of having a cyber security incident in the near future using a variety of externally collected Internet measurement data, some of which capture active maliciousness (e.g., spam and phishing/malware activities) while others capture more latent factors (e.g., misconfiguration and mismanagement). While machine learning techniques have been extensively used for detection in the cyber security literature, using them for prediction has rarely been done. This is the first study on the prediction of broad categories of security incidents at an organizational level. Our work to date shows that with the right choice of feature set, highly accurate predictions can be achieved with a forecasting window of 6-12 months. Given the increasing number of high-profile security incidents (Target, Home Depot, JP Morgan Chase, and Anthem, just to name a few) and the social and economic costs they inflict, this work will have a major impact on cyber security risk management.
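The forecasting task above can be sketched as a supervised classification problem. The example below is purely illustrative: the features, label rule, and data are synthetic stand-ins (not the study's actual feature set), used only to show the train/predict shape of organization-level incident forecasting.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Hypothetical per-organization features summarized over an observation
# window: active-maliciousness signals plus a latent mismanagement score.
# The label is "had an incident within the forecasting window".
n = 1000
X = np.column_stack([
    rng.poisson(3, n),        # blacklist listings (spam/phishing activity)
    rng.poisson(1, n),        # malware-hosting reports
    rng.uniform(0, 1, n),     # mismanagement score (latent factor)
])
risk = 0.3 * X[:, 0] + 0.5 * X[:, 1] + 2.0 * X[:, 2]
y = (risk + rng.normal(0, 0.5, n) > 2.5).astype(int)

# Train on one subset of organizations, evaluate on held-out ones.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)   # held-out accuracy on the synthetic data
```

In practice, careful feature engineering over the measurement data and evaluation with temporally separated train/test windows matter far more than the choice of classifier.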
2. Detecting propagation in temporal data, with application to identifying phishing activities. Phishing activities propagate from one network to another in a highly regular fashion, a phenomenon known as fast-flux, though how the destination networks are chosen by the malicious campaign remains unknown. An interesting challenge arises as to whether one can use community detection methods to automatically extract the networks involved in a single phishing campaign; the ability to do so would be critical to forensic analysis. While there have been many results on detecting communities defined as subsets of relatively strongly connected entities, phishing activity exhibits a unique propagating property that is better captured using an epidemic model. By using a combination of epidemic modeling and regression, we can identify this type of propagating community with reasonable accuracy; we are working on alternative methods as well.
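To illustrate the epidemic-model view (a toy sketch, not the group's actual model), the snippet below simulates SI-style propagation over a random contact graph, a stand-in for phishing activity hopping between networks; the resulting S-shaped cumulative curve is the kind of temporal signature an epidemic-model fit would target.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical contact graph over 50 networks; infected nodes infect each
# susceptible neighbor with probability beta per time step (SI model).
n, beta = 50, 0.1
adj = rng.random((n, n)) < 0.08
adj = np.triu(adj, 1)
adj = adj | adj.T                              # symmetric, no self-loops

infected = np.zeros(n, dtype=bool)
infected[0] = True                             # seed network of the campaign
history = [infected.copy()]
for _ in range(30):
    pressure = adj[:, infected].sum(axis=1)    # infected neighbors per node
    new = rng.random(n) < 1 - (1 - beta) ** pressure
    infected = infected | new
    history.append(infected.copy())

curve = np.array([h.sum() for h in history])   # cumulative infected count
```

Fitting such a model to observed blacklist timestamps, and regressing on which networks light up together, is one way to delineate the community behind a single campaign.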
3. Data-driven modeling of organizational and end-user security posture. We are working to build models that accurately capture the cyber security postures of end users as well as organizations, using large quantities of Internet measurement data. One domain concerns how software vendors disclose security vulnerabilities in their products, how they deploy software upgrades and patches, and in turn, how end users install these patches; all of these elements combined lead to a better understanding of the overall state of vulnerability of a given machine and how that relates to user behaviors. Another domain concerns the interconnectedness of today’s Internet, which implies that what we see from one network is inevitably related to others. We use this connection to gain better insight into the conditions of not just a single network viewed in isolation, but multiple networks viewed together.
Necmiye Ozay, PhD, is Assistant Professor of Electrical Engineering and Computer Science, College of Engineering, at the University of Michigan, Ann Arbor.
Prof. Ozay and her team develop the scientific foundations and associated algorithmic tools for compactly representing and analyzing heterogeneous data streams from sensor- and information-rich networked dynamical systems. They take a unified dynamics-based and data-driven approach to the design of passive and active monitors for anomaly detection in such systems. Dynamical models naturally capture temporal (i.e., causal) relations within data streams. Moreover, one can use hybrid and networked dynamical models to capture, respectively, logical relations and interactions between different data sources. They study structural properties of networks and dynamics to understand fundamental limitations of anomaly detection from data. By recasting the information extraction problem as a networked hybrid system identification problem, they bring to bear tools from computer science, system and control theory, and convex optimization to efficiently and rigorously analyze and organize information. Applications include diagnostics, anomaly detection, and change detection in critical infrastructure such as building management systems, transportation networks, and energy networks.
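A minimal sketch of the passive-monitor idea (synthetic system and thresholds assumed, not the group's methods): learn a linear dynamical model from nominal data, then flag time steps where the one-step prediction residual exceeds a threshold.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical scalar system x[t+1] = a * x[t] + noise; generate nominal data.
a_true = 0.9
nominal = [0.0]
for _ in range(500):
    nominal.append(a_true * nominal[-1] + rng.normal(scale=0.1))
nominal = np.array(nominal)

# Identify the dynamics by least squares, and set a residual threshold.
a_hat = (nominal[:-1] @ nominal[1:]) / (nominal[:-1] @ nominal[:-1])
resid = nominal[1:] - a_hat * nominal[:-1]
threshold = 4 * resid.std()

# Monitor a test stream with an injected offset fault at step 120.
test = nominal[:200].copy()
test[120:] += 1.0
test_resid = np.abs(test[1:] - a_hat * test[:-1])
alarms = np.where(test_resid > threshold)[0]   # indices of flagged steps
```

Hybrid and networked extensions replace the scalar model with switched or interconnected dynamics, but the monitor structure (model, residual, threshold) is the same.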