Professor Seiford’s research interests are primarily in the areas of quality engineering, productivity analysis, process improvement, multiple-criteria decision making, and performance measurement. In addition, he is recognized as one of the world’s experts in the methodology of Data Envelopment Analysis. His current research involves the development of benchmarking models for identifying best practice in manufacturing and service systems. He has written or co-authored four books and over one hundred articles in the areas of quality, productivity, operations management, process improvement, decision analysis, and decision support systems.
Matias D. Cattaneo, Ph.D., is Professor of Economics and Statistics in the College of Literature, Science, and the Arts at the University of Michigan, Ann Arbor.
Prof. Cattaneo’s research interests include econometric theory, mathematical statistics, and applied econometrics, with a focus on causal inference, program evaluation, high-dimensional problems, and applied microeconomics. Most of his recent research relates to the development of new, improved semiparametric, nonparametric, and high-dimensional inference procedures exhibiting demonstrably superior robustness with respect to tuning parameters and other implementation choices. His work is motivated by concrete empirical problems in the social, biomedical, and statistical sciences, covering a wide array of topics in settings related to treatment effects and policy evaluation, high-dimensional models, average derivatives and structural response functions, applied finance, and applied decision theory, among others.
Ding Zhao, PhD, is an Assistant Research Scientist in the Department of Mechanical Engineering, College of Engineering, with a secondary appointment in the Robotics Institute, at the University of Michigan, Ann Arbor.
Dr. Zhao’s research interests include autonomous vehicles, intelligent/connected transportation, traffic safety, human-machine interaction, rare-events analysis, dynamics and control, machine learning, and big data analysis.
My research focuses on the development and application of machine learning tools to large-scale financial and unstructured (textual) data in order to extract, quantify, and predict the risk profiles and investment-grade ratings of private and public companies. Example datasets include social media and financial aggregators such as Bloomberg, Pitchbook, and Privco.
Matthew Kay, PhD, is Assistant Professor of Information, School of Information, and Assistant Professor of Electrical Engineering and Computer Science, College of Engineering, at the University of Michigan, Ann Arbor.
Prof. Kay’s research includes work on communicating uncertainty, usable statistics, and personal informatics. People are increasingly exposed to sensing and prediction in their daily lives (“how many steps did I take today?”, “how long until my bus shows up?”, “how much do I weigh?”). Uncertainty is both inherent to these systems and usually poorly communicated. To build understandable data presentations, we must study how people interpret their data and what goals they have for it, which informs the way that we should communicate results from our models, which in turn determines what models we must use in the first place. Prof. Kay tackles these problems using a multi-faceted approach, including qualitative and quantitative analysis of behavior, building and evaluating interactive systems, and designing and testing visualization techniques. His work draws on approaches from human-computer interaction, information visualization, and statistics to build information visualizations that people can more easily understand along with the models to back those visualizations.
Mingyan Liu, PhD, is Professor of Electrical Engineering and Computer Science, College of Engineering, at the University of Michigan, Ann Arbor.
Prof. Liu’s research interests lie in optimal resource allocation, sequential decision theory, online learning and machine learning, and the performance modeling, analysis, and design of large-scale, decentralized, stochastic, networked systems, using tools including stochastic control, optimization, game theory, and mechanism design. Her most recent research activities involve sequential learning; the modeling and mining of large-scale Internet measurement data concerning cyber security; and incentive mechanisms for interdependent security games. Within this context, her research group is actively working in the following directions.
1. Cyber security incident forecasting. The goal is to predict an organization’s likelihood of having a cyber security incident in the near future using a variety of externally collected Internet measurement data, some of which capture active maliciousness (e.g., spam and phishing/malware activities) while others capture more latent factors (e.g., misconfiguration and mismanagement). While machine learning techniques have been used extensively for detection in the cyber security literature, they have rarely been used for prediction. This is the first study of the prediction of broad categories of security incidents at an organizational level. Our work to date shows that, with the right choice of feature set, highly accurate predictions can be achieved with a forecasting window of 6-12 months. Given the increasing number of high-profile security incidents (Target, Home Depot, JP Morgan Chase, and Anthem, to name a few) and the social and economic costs they inflict, this work will have a major impact on cyber security risk management.
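The forecasting task described above can be caricatured as supervised binary classification: externally measured signals become features, and "incident within the window" becomes the label. A minimal sketch follows, with hypothetical feature names and toy data; it is not the group's actual pipeline, feature set, or model.

```python
# Hedged sketch: organization-level incident forecasting as binary
# classification. Features, data, and the logistic model are illustrative
# assumptions, not the actual research pipeline.
import math

def train_logistic(X, y, lr=0.1, epochs=500):
    """Fit a logistic-regression classifier with plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted incident probability
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict_proba(w, b, x):
    """Probability that an organization has an incident within the window."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical per-organization features:
# [spam_activity, phishing_activity, misconfiguration_score]
X = [
    [0.9, 0.8, 0.7],  # poorly managed network, later had an incident
    [0.8, 0.9, 0.6],
    [0.1, 0.2, 0.1],  # well-managed network, no incident
    [0.2, 0.1, 0.2],
]
y = [1, 1, 0, 0]  # 1 = incident observed within the forecasting window

w, b = train_logistic(X, y)
risk = predict_proba(w, b, [0.85, 0.9, 0.8])  # a new, poorly managed org
```

The point of the sketch is the framing, not the model: in practice the cited work's accuracy depends on the choice of feature set, and any off-the-shelf classifier could stand in for the logistic regression here.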
2. Detecting propagation in temporal data, with application to identifying phishing activities. Phishing activities propagate from one network to another in a highly regular fashion, a phenomenon known as fast-flux, though how the destination networks are chosen by the malicious campaign remains unknown. An interesting challenge is whether community detection methods can automatically extract the networks involved in a single phishing campaign; the ability to do so would be critical to forensic analysis. While there have been many results on detecting communities defined as subsets of relatively strongly connected entities, phishing activity exhibits a unique propagating property that is better captured by an epidemic model. Using a combination of epidemic modeling and regression, we can identify this type of propagating community with reasonable accuracy; we are working on alternative methods as well.
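To make the notion of a "propagating community" concrete, here is a deliberately simplified sketch: campaign infrastructure hopping between networks is modeled as a susceptible-infected (SI) cascade on a graph, and the community is the set of networks the cascade reaches. The network names, the graph, and the deterministic one-hop-per-step spread are all illustrative assumptions; the actual work combines epidemic modeling with regression on measurement data.

```python
# Hedged sketch: a propagating community as the reach of an SI cascade.
# Networks, edges, and the deterministic spread rule are hypothetical.
from collections import deque

def si_cascade(adjacency, seed):
    """Deterministic SI spread: each infected network infects its neighbors
    one hop per time step. Returns {network: infection_time}."""
    times = {seed: 0}
    queue = deque([seed])
    while queue:
        u = queue.popleft()
        for v in adjacency.get(u, []):
            if v not in times:
                times[v] = times[u] + 1
                queue.append(v)
    return times

# Hypothetical networks abused by a single phishing campaign:
adjacency = {
    "net_a": ["net_b", "net_c"],
    "net_b": ["net_d"],
    "net_c": ["net_d"],
    "net_d": [],
    "net_e": ["net_f"],  # unrelated networks, never reached by this campaign
    "net_f": [],
}
times = si_cascade(adjacency, "net_a")
community = set(times)  # the propagating community seeded at net_a
```

In this toy setting the detection direction would run in reverse: given observed infection times per network, fit an epidemic model (e.g., by regression) and group together the networks whose timing is consistent with a single cascade.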
3. Data-driven modeling of organizational and end-user security posture. We are working to build models that accurately capture the cyber security postures of end users as well as organizations, using large quantities of Internet measurement data. One domain concerns how software vendors disclose security vulnerabilities in their products, how they deploy software upgrades and patches, and, in turn, how end users install these patches; combined, these elements lead to a better understanding of the overall state of vulnerability of a given machine and how it relates to user behavior. Another domain concerns the interconnectedness of today’s Internet, which implies that what we see from one network is inevitably related to others. We use this connection to gain better insight into the conditions not just of a single network viewed in isolation, but of multiple networks viewed together.
Necmiye Ozay, PhD, is Assistant Professor of Electrical Engineering and Computer Science, College of Engineering, at the University of Michigan, Ann Arbor.
Prof. Ozay and her team develop the scientific foundations and associated algorithmic tools for compactly representing and analyzing heterogeneous data streams from sensor- and information-rich networked dynamical systems. They take a unified dynamics-based and data-driven approach to the design of passive and active monitors for anomaly detection in such systems. Dynamical models naturally capture temporal (i.e., causal) relations within data streams. Moreover, one can use hybrid and networked dynamical models to capture, respectively, logical relations and interactions between different data sources. They study structural properties of networks and dynamics to understand fundamental limitations of anomaly detection from data. By recasting the information-extraction problem as a networked hybrid system identification problem, they bring to bear tools from computer science, system and control theory, and convex optimization to efficiently and rigorously analyze and organize information. Applications include diagnostics, anomaly detection, and change detection in critical infrastructure such as building management systems, transportation networks, and energy networks.
My research interests are in developing interdisciplinary knowledge in System Informatics as the basis for the study of complex system problems, fusing theory, computation, and application components adopted from the Systems and Informatics fields. In this framework, a complex system such as a supply chain is posited as a System-of-Systems; i.e., a collection of individual business entities organized as a composite system, with their resources and capabilities pooled to obtain an interoperable and synergistic whole possessing common, shared goals and objectives. Informatics facilitates coordination and integration in the system by processing and sharing information among supply chain entities for improved decision-making.
A common theme of my research is the foundational universality of systems and the realization that what makes each system unique is its environment. This has enabled me to categorize problems, designs, models, methodologies, and solution techniques at the macro and micro levels, and to develop innovative solutions by coordinating these levels in an integrated environment.
My goal is to study the efficacy of the body of knowledge available in Systems Theory, Information Science, Artificial Intelligence & Knowledge Management, Management Science, Industrial Engineering, and Operations Research, applied uniquely to the issues and problems of complex systems in the manufacturing and service sectors.
My theoretical work in this research thrust relates to:
- Developing Generalized System Taxonomies and Ontologies for complex systems management.
- Experimenting with Problem Taxonomies for design and modeling efficiencies in complex system networks.
- Developing methodologies, frameworks and reference models for complex systems management.
My computation and application development has focused on algorithms and software for:
- Supply chain information system and knowledge library using Web-based technology as a dissemination tool.
- Integration with Enterprise Resource Planning modules in SAP software.
- Supply chain management problem-solving through application of problem specific simulation and optimization.
My research has extended to application domains in the healthcare, textiles, automotive, and defense sectors. The problems and issues addressed relate to health care management, the operationalization of sustainability, energy conservation, global logistics management, mega-disaster recovery, humanitarian needs management, and entrepreneurship management.
Currently, my application focus is on expanding the breadth and depth of inquiry in the healthcare domain. Among the topics being investigated are: (1) the organization and structure of health care enterprises; and (2) operations and strategies that relate to the management of critical success factors, such as costs, quality, innovation, and technology adoption by health care providers. Two significant topics I have chosen to study with regard to care for elderly patients suffering from chronic congestive heart failure and hypertension are: (1) the design of patient-centered health care delivery to improve quality of care; and (2) managing the increased care costs resulting from readmission of these patients.
Data science applications: real-time data processing in supply chains; knowledge portals for decision-making in supply chains; information sharing for optimizing patient-centered healthcare delivery.
Professor Subramanian is interested in a variety of stochastic modeling, decision and control theoretic, and applied probability questions concerned with networks. Examples include analysis of random graphs, analysis of processes like cascades on random graphs, network economics, analysis of e-commerce systems, mean-field games, network games, telecommunication networks, load-balancing in large server farms, and information assimilation, aggregation and flow in networks especially with strategic users.
My lab creates systems that use a combination of human and machine computation to solve problems quickly and reliably. We have introduced the idea of continuous real-time crowdsourcing, as well as the ‘crowd agent’ model, which uses computer-mediated groups of people submitting input simultaneously to create a collective intelligence capable of completing tasks better than any constituent member.