My research focuses on using digital health solutions, signal processing, machine learning, and ecological momentary assessment to understand the physiological and psychological determinants of symptoms in patients with atrial fibrillation. I am building a research framework for rich data collection using smartphone apps, medical records, and wearable sensors. I believe that creating a multidimensional dataset to study atrial fibrillation will yield important insights and serve as a model for studying other chronic medical conditions.
His research interests lie at the intersection of signal processing, data science, machine learning, and numerical optimization. He is particularly interested in computational methods for learning low-complexity models from high-dimensional data, leveraging tools from machine learning, numerical optimization, and high-dimensional geometry, with applications in imaging sciences, scientific discovery, and healthcare. More recently, he has also become interested in understanding deep networks through the lens of low-dimensional modeling.
Yixin Wang works in the fields of Bayesian statistics, machine learning, and causal inference, with applications to recommender systems, text data, and genetics. She also works on algorithmic fairness and reinforcement learning, often via connections to causality. Her research centers on developing practical and trustworthy machine learning algorithms for large datasets that can enhance scientific understanding and inform daily decision-making. Her research interests lie at the intersection of theory and applications.
My research focuses on computer security and privacy, with an emphasis on problems that broadly impact society and public policy. Topics that interest me include software security, network security, data privacy, anonymity, election cybersecurity, censorship resistance, computer forensics, ethics, and cybercrime. I’m also interested in the interaction of technology with politics and international affairs.
My broad research interests are in multi-agent systems, computational economics and finance, and artificial intelligence. I apply techniques from algorithmic game theory, statistical machine learning, and decision theory to a variety of problems at the intersection of the computational and social sciences. A major focus of my research has been the design and analysis of market-making algorithms for financial markets and, in particular, prediction markets — incentive-based mechanisms for aggregating data in the form of private beliefs about uncertain events (e.g., the outcome of an election) distributed among strategic agents. I use both analytical and simulation-based methods to investigate the impact of factors such as wealth, risk attitude, and manipulative behavior on information aggregation in market ecosystems. Another line of work I am pursuing involves algorithms for allocating resources based on preference data collected from potential recipients while satisfying efficiency, fairness, and diversity criteria; my joint work on ethnicity quotas in Singapore public housing allocation deserves special mention in this vein. More recently, I have become involved in research on empirical game-theoretic analysis, a family of methods for building tractable models of complex, procedurally defined games from empirical or simulated payoff data and using them to reason about game outcomes.
Eric Gilbert is the John Derby Evans Associate Professor in the School of Information — and a Professor in CSE — at the University of Michigan. Before coming to Michigan, he led the comp.social lab at Georgia Tech. Dr. Gilbert is a sociotechnologist whose research focuses on building and studying social media systems. His work has been supported by grants from Facebook, Samsung, Yahoo!, Google, NSF, ARL, and DARPA. Dr. Gilbert’s work has been recognized with multiple best paper awards and covered by outlets including Wired, NPR, and The New York Times. He is the recipient of an NSF CAREER award and the Sigma Xi Young Faculty Award. Professor Gilbert holds a BS in Math & CS and a PhD in CS, both from the University of Illinois at Urbana-Champaign.
Albert S. Berahas is an Assistant Professor in the Department of Industrial & Operations Engineering. His research broadly focuses on designing, developing, and analyzing algorithms for solving large-scale nonlinear optimization problems. Such problems are ubiquitous, arising in a plethora of areas such as engineering design, economics, transportation, robotics, machine learning, and statistics. Specifically, he is interested in and has explored several sub-fields of nonlinear optimization, including: (i) general nonlinear optimization algorithms, (ii) optimization algorithms for machine learning, (iii) constrained optimization, (iv) stochastic optimization, (v) derivative-free optimization, and (vi) distributed optimization.
Harrison Crandall is a Web Developer and Social Media Assistant at MIDAS. He is a senior at the University of Michigan with a passion for front-end development and data science. Harrison lives in Larchmont, New York; in 2015 he founded a company named Larchmont Web Design to create websites for local businesses. Prior to working at MIDAS, he worked as a web developer at an advertising agency in Norwalk, CT.
His research broadly concerns the interplay of complex stochastic systems and big data, including large-scale communication and computing systems for big-data processing, private data marketplaces, and large-scale graph mining.