I am a Research Fellow in the Inter-university Consortium for Political and Social Research (ICPSR) at the University of Michigan. My research is currently supported by an NSF project, Developing Evidence-based Data Sharing and Archiving Policies, in which I am analyzing curation activities, automatically detecting data citations, and contributing to metrics for tracking the impact of data reuse. I hold a Ph.D. in Geography from UC Santa Barbara and have expertise in GIScience, spatial information science, and urban planning. My interests also include the Semantic Web, innovative GIS education, and the science of science. I have experience deploying geospatial applications, designing linked data models, and developing visualizations to support data discovery.
Cyber-security is a complex and multi-dimensional research field. My research style is interdisciplinary, rooted primarily in economics, econometrics, data science (AI/ML and Bayesian and frequentist statistics), game theory, and network science. I investigate socially pressing issues affecting the quality of cyber-risk management in modern networked and distributed engineering systems such as IoT-driven critical infrastructures, cloud-based service networks, and app-based systems (e.g., mobile commerce, smart homes), to name a few. I take delight in proposing data-driven, rigorous, and interdisciplinary solutions both to existing fundamental challenges that pose a practical bottleneck to (cost-)effective cyber-risk management and to emerging cyber-security and privacy issues that might plague modern networked engineering systems. I strive for originality, practical significance, and mathematical rigor in my solutions. One of my primary end goals is to conceptually get my arms around complex, multi-dimensional information security and privacy problems in a way that helps, informs, and empowers practitioners and policy makers to take the right steps toward making cyberspace more secure.
The Aguilar group is focused on understanding transcriptional and epigenetic mechanisms of skeletal muscle stem cells in diverse contexts such as regeneration after injury and aging. We focus on this area because there are few to no therapies for skeletal muscle after injury or aging. We use various in vivo and in vitro models in combination with genomic assays and high-throughput sequencing to study these molecular mechanisms.
My methodological research focuses on developing statistical methods for routinely collected healthcare databases such as electronic health records (EHR) and claims data. I aim to tackle the unique challenges that arise from the secondary use of real-world data for research purposes. Specifically, I develop novel causal inference methods and semiparametric efficiency theory that harness the full potential of EHR data to address comparative effectiveness and safety questions. I also develop scalable, automated pipelines for the curation and harmonization of EHR data across healthcare systems and coding systems.
My research interests are in natural language semantics and psycholinguistics, focusing on verbs. I conduct behavioral psycholinguistic experiments with methodologies such as self-paced reading and maze tasks, as well as surveys of linguistic and semantic judgments. I also study semantic variation using corpora and datasets such as the Twitter Decahose to better understand how words have developed diverging meanings in different communities, age groups, or regions. I primarily use R and Python to collect, manage, and analyze data. I direct the UM WordLab in the linguistics department, working with students (especially undergraduates) on experimental and computational research focusing on lexical representations.
Dr. Likosky is a Professor and Head of the Section of Health Services Research and Quality in the Department of Cardiac Surgery at Michigan Medicine, and a faculty member at the Center for Healthcare Outcomes and Policy. His work currently focuses on leveraging (i) mobile health technology to identify objective and scalable measures for mitigating post-surgical morbidities, and (ii) computer vision to identify objective and scalable measures of important intraoperative technical skills and non-technical practices.
We are interested in resolving outstanding fundamental scientific problems that impede the computational materials design process. Our group uses high-throughput density functional theory, applied thermodynamics, and materials informatics to deepen our fundamental understanding of synthesis-structure-property relationships while exploring new chemical spaces for functional technological materials. These research interests are driven by the practical goal of the U.S. Materials Genome Initiative to accelerate materials discovery, but achieving that goal requires basic research in synthesis science, inorganic chemistry, and materials thermodynamics.
We have developed and tested machine learning approaches that integrate quantitative markers for the diagnosis and assessment of progression of temporomandibular joint osteoarthritis (TMJ OA), extended the capabilities of 3D Slicer into web-based tools, and disseminated open-source image analysis tools. Our aims combine data processing and in-depth analytics with learning using privileged information, integrated feature selection, and testing the performance of longitudinal risk predictors. Our long-term goal is to improve the diagnosis and risk prediction of TMJ OA in future multicenter studies.
The Spectrum of Data Science for Diagnosis of Osteoarthritis of the Temporomandibular Joint
As a board-certified ophthalmologist and glaucoma specialist, I have more than 15 years of clinical experience caring for patients with different types and complexities of glaucoma. In addition to my clinical experience, as a health services researcher I have developed expertise in several disciplines, including analyzing large health care claims databases to study utilization and outcomes of patients with ocular diseases, racial and other disparities in eye care, and associations between systemic conditions or medication use and ocular diseases. I have learned the nuances of various data sources and ways to maximize their use to answer important and timely questions. Leveraging my background in health services research along with new skills in bioinformatics and precision medicine, over the past 2-3 years I have been developing and growing the Sight Outcomes Research Collaborative (SOURCE) repository, a powerful tool that researchers can tap into to study patients with ocular diseases. My team and I have spent countless hours devising ways of extracting electronic health record data from Clarity, cleaning and de-identifying the data, and making it linkable to ocular diagnostic test data (OCT, HVF, biometry) and non-clinical data. Having successfully developed such a resource here at Kellogg, I am now collaborating with colleagues at more than two dozen academic ophthalmology departments across the country to help them extract their data in the same format and send it to Kellogg, so that we can pool the data and make it accessible to researchers at all of the participating centers for research and quality improvement studies.
I am also actively exploring ways to integrate data from SOURCE into deep learning and artificial intelligence algorithms; use SOURCE data for genotype-phenotype association studies and the development of polygenic risk scores for common ocular diseases; capture patient-reported outcome data for the majority of eye care recipients; enhance visualization of the data on easy-to-access dashboards to aid quality improvement initiatives; and use the data to enhance quality of care, safety, and efficiency of care delivery, and to improve clinical operations.
Dr. Douville is a critical care anesthesiologist with an investigative background in bioinformatics and perioperative outcomes research. He studies techniques for utilizing health care data, including genotype, to deliver personalized medicine in the perioperative period and intensive care unit. His research background has focused on ways technology can assist health care delivery to improve patient outcomes. This began with designing microfluidic chips capable of recreating the fluid mechanics of atelectatic alveoli and monitoring the resulting barrier breakdown in real time. His interest in bioinformatics was sparked when he observed how methodology designed for tissue engineering could be adapted to the nano-scale to enable genomic analysis. Additionally, his engineering training provided the framework to apply data-driven modeling techniques, such as finite element analysis, to complex biological systems.