My methodological research focuses on developing statistical methods for routinely collected healthcare databases such as electronic health records (EHRs) and claims data. I aim to tackle the unique challenges that arise from the secondary use of real-world data for research. Specifically, I develop novel causal inference methods and semiparametric efficiency theory that harness the full potential of EHR data to address comparative effectiveness and safety questions. I also develop scalable, automated pipelines for curating and harmonizing EHR data across healthcare systems and coding systems.
Fred Conrad’s research concerns the development of new methods and data sources for conducting social research. His work focuses largely on survey methodology, but he also explores the use of social media content as a complement to survey data and as a source of large-scale qualitative insights. His focus is on data quality and reducing measurement error. For example, live video interviews promote more thoughtful responses, e.g., less straightlining (the tendency to give the same answer to an entire battery of survey questions), but they also promote less candor when answering questions on sensitive topics. Measurement error in social media includes misclassification in the automated interpretation of content by methods such as sentiment analysis and topic modeling, as well as selective self-presentation (posting only flattering content). Equally challenging is not knowing the extent to which users differ from the population to which one might wish to generalize results.
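Straightlining as described above lends itself to a simple nondifferentiation index. The sketch below is purely illustrative (the function name and data are invented, not from any actual survey instrument): it flags respondents who give the identical answer to every item in a battery.

```python
# Hypothetical illustration: quantifying straightlining in a grid of
# Likert-scale responses (rows = respondents, columns = battery items).

def straightlining_rate(responses):
    """Fraction of respondents who give the identical answer to every
    item in a battery -- a simple nondifferentiation index."""
    flagged = sum(1 for row in responses if len(set(row)) == 1)
    return flagged / len(responses)

battery = [
    [4, 4, 4, 4, 4],  # straightliner
    [5, 3, 4, 2, 4],
    [1, 1, 1, 1, 1],  # straightliner
    [2, 3, 3, 4, 2],
]
print(straightlining_rate(battery))  # 0.5
```

In practice one would compare this rate across modes (e.g., live video versus self-administered web) to probe the mode effects described above.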
I work on the analysis of sports as they relate to economics, business, finance, history, performance modeling, analytics, and prediction/forecasting. I typically use panel data econometric techniques to understand team performance in professional sports. I am also interested in forecasting models.
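The workhorse of panel data econometrics is the fixed-effects ("within") estimator, which removes time-invariant team heterogeneity by demeaning within each team. The following is a minimal sketch of that idea with a single regressor, not the author's actual specification:

```python
import numpy as np

# Illustrative one-way fixed-effects ("within") estimator for a
# team-performance panel: demean y and x within each team, then
# regress the demeaned outcome on the demeaned regressor.

def within_estimator(y, x, team_ids):
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    team_ids = np.asarray(team_ids)
    y_dm = np.empty_like(y)
    x_dm = np.empty_like(x)
    for t in np.unique(team_ids):
        mask = team_ids == t
        y_dm[mask] = y[mask] - y[mask].mean()  # sweeps out the team effect
        x_dm[mask] = x[mask] - x[mask].mean()
    return (x_dm @ y_dm) / (x_dm @ x_dm)       # OLS slope on within variation

# Two teams with different intercepts but a common slope of 2:
beta = within_estimator([12, 14, 16, 22, 24, 26],
                        [1, 2, 3, 1, 2, 3],
                        [0, 0, 0, 1, 1, 1])
print(beta)  # 2.0
```

Because the team-level intercepts (12 vs. 22 at x = 1) are swept out by demeaning, the estimator recovers the slope exactly in this noiseless example.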
Professor Kowalski’s recent research analyzes experiments and clinical trials with the goal of designing policies to target insurance expansions and medical treatments to individuals who stand to benefit from them the most. Her research has also explored the impact of previous Medicaid expansions, the Affordable Care Act, the Massachusetts health reform of 2006, and employer-sponsored health insurance plans. She has also used cutting-edge techniques to estimate the value of medical spending on at-risk newborns.
We are interested in resolving outstanding fundamental scientific problems that impede the computational materials design process. Our group uses high-throughput density functional theory, applied thermodynamics, and materials informatics to deepen our fundamental understanding of synthesis-structure-property relationships, while exploring new chemical spaces for functional technological materials. These research interests are driven by the practical goal of the U.S. Materials Genome Initiative to accelerate materials discovery, a goal whose realization requires basic research in synthesis science, inorganic chemistry, and materials thermodynamics.
My research interests lie in the design and analysis of randomized controlled trials (RCTs), partial identification, identification and inference with multi-valued treatments and instruments, and quantile regression. In one recent paper I study the optimal stratified randomization procedure in RCTs and find that a certain kind of matched-pair design is optimal. In another paper (coauthored with Joe Romano and Azeem Shaikh), we provide an asymptotically exact inference procedure for matched-pair designs. In a third paper we study inference with moment inequalities whose dimension grows exponentially with the sample size. I also have a paper in which we characterize the sharp identified sets for various treatment effects with multi-valued instruments and multi-valued treatments.
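A matched-pair design of the general kind mentioned above can be sketched in a few lines: sort units on a baseline covariate, pair adjacent units, and flip a fair coin within each pair. This illustrates the design class only, not the paper's optimality result or inference procedure:

```python
import random

# Hedged sketch of matched-pair randomization: sort units on a baseline
# covariate, pair adjacent units, then randomize treatment within pairs.

def matched_pair_assign(covariates, rng=None):
    rng = rng or random.Random(0)
    order = sorted(range(len(covariates)), key=lambda i: covariates[i])
    assignment = [None] * len(covariates)
    # Pair consecutive units in the sorted order: (0,1), (2,3), ...
    for a, b in zip(order[::2], order[1::2]):
        if rng.random() < 0.5:
            assignment[a], assignment[b] = 1, 0
        else:
            assignment[a], assignment[b] = 0, 1
    return assignment

cov = [0.3, 0.9, 0.1, 0.5, 0.7, 0.2, 0.8, 0.4]
print(matched_pair_assign(cov))
```

By construction, exactly one unit in each covariate-matched pair is treated, so treatment and control groups are balanced on the pairing covariate.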
My primary research focuses on the measurement and monitoring of risks in banks, both at the level of individual banks and at the level of the financial system as a whole. In a recent paper, we developed a high-dimensional statistical approach to measure connectivity across different players in the financial sector, implementing the model using stock return data for US banks, insurance companies, and hedge funds. Some of my early research developed analytical tools to measure banks’ default risk using option pricing models and other tools of financial economics. These projects often have a significant empirical component that uses large financial datasets and econometric tools. Of late, I have been working on several projects related to equity and inclusion in financial markets; these papers use large datasets from financial markets to understand differences in the quantity and quality of financial services received by minority borrowers. A common theme across these projects is causal inference using state-of-the-art econometric tools. Finally, some of my ongoing research projects relate to FinTech, with a focus on credit scoring and online lending.
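The flavor of return-based connectivity measurement can be conveyed with a deliberately simplified toy: threshold pairwise return correlations into a network and count each institution's links. This is a stand-in only; it is not the paper's high-dimensional estimator, and all names and thresholds here are illustrative.

```python
import numpy as np

# Highly simplified illustration of return-based connectivity: build a
# network among institutions by thresholding pairwise return correlations,
# then report each institution's number of links ("degree").

def connectivity_degrees(returns, threshold=0.5):
    """returns: (T, N) array of T periods of returns for N institutions."""
    corr = np.corrcoef(returns, rowvar=False)  # N x N correlation matrix
    np.fill_diagonal(corr, 0.0)                # ignore self-links
    adjacency = np.abs(corr) > threshold
    return adjacency.sum(axis=1)               # links per institution
```

With two highly correlated institutions and one independent one, the first two each get one link and the third gets none. Real implementations must also confront the high dimensionality (N comparable to T) that motivates the statistical approach described above.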
My research is at the intersection of neuroscience and artificial intelligence. My group uses neuroscience or brain-inspired principles to design models and algorithms for computer vision and language processing. In turn, we use neural network models to test hypotheses in neuroscience and to explain or predict human perception and behavior. My group also develops and uses machine learning algorithms to improve the acquisition and analysis of medical images, including functional magnetic resonance imaging of the brain and magnetic resonance imaging of the gut.
We use brain-inspired neural network models to predict and decode brain activity in humans processing information from naturalistic audiovisual stimuli.
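Predicting brain activity from a model's stimulus representations is commonly done with a regularized linear "encoding model." The sketch below assumes that framing (ridge regression from network-derived features to measured responses); the data and function name are synthetic placeholders, not our actual pipeline.

```python
import numpy as np

# Hedged sketch of a voxelwise encoding model: ridge regression mapping
# stimulus features (e.g., a network layer's activations over time) to
# measured brain responses. W solves (X'X + aI) W = X'Y in closed form.

def fit_ridge(features, responses, alpha=1.0):
    """features: (T, d) design matrix; responses: (T, v) brain signals."""
    d = features.shape[1]
    gram = features.T @ features + alpha * np.eye(d)
    return np.linalg.solve(gram, features.T @ responses)  # (d, v) weights
```

Held-out prediction accuracy of such a model is then the usual test of how well the network's representations account for the measured activity.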
Alex Gorodetsky’s research is at the intersection of applied mathematics, data science, and computational science, and is focused on enabling autonomous decision making under uncertainty. He is especially interested in controlling, designing, and analyzing autonomous systems that must act in complex environments where observational data and expensive computational simulations must work together to ensure objectives are achieved. Toward this goal, he pursues research in wide-ranging areas including uncertainty quantification, statistical inference, machine learning, control, and numerical analysis. His methodology is to increase the scalability of probabilistic modeling and analysis techniques such as Bayesian inference and uncertainty quantification. His current strategies for achieving scalability revolve around leveraging computational optimal transport, developing tensor network learning algorithms, and creating new multi-fidelity information fusion approaches.
Sample workflow for enabling autonomous decision making under uncertainty for a drone operating in a complex environment. We develop algorithms to compress simulation data by exploiting problem structure. We then embed the compressed representations onto onboard computational resources. Finally, we develop approaches to enable the drone to adapt, learn, and refine knowledge by interacting with, and collecting data from, the environment.
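The "compress simulation data by exploiting problem structure" step can be illustrated with the simplest structure-exploiting compressor, a truncated SVD of a snapshot matrix. This is a toy stand-in for the tensor-network and multi-fidelity methods described above, with illustrative names throughout:

```python
import numpy as np

# Illustrative compression of simulation data: a truncated SVD keeps only
# the r leading modes of a snapshot matrix, yielding a small basis that
# could be stored on onboard computational resources.

def compress_snapshots(snapshots, rank):
    """snapshots: (n_states, n_snapshots) matrix of simulation outputs."""
    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    basis = U[:, :rank]                    # (n_states, rank) reduced basis
    coeffs = s[:rank, None] * Vt[:rank]    # (rank, n_snapshots) coordinates
    return basis, coeffs                   # reconstruct via basis @ coeffs
```

When the snapshot matrix is (numerically) low rank, as structured simulation data often is, the reconstruction `basis @ coeffs` is exact while storing far fewer numbers, which is the premise of embedding compressed representations onboard.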