Dr. Yi Lu Murphey is an Associate Dean for Graduate Education and Research, a Professor in the Electrical and Computer Engineering (ECE) Department, and the director of the Intelligent Systems Lab at the University of Michigan, Dearborn. She received an M.S. degree in computer science from Wayne State University, Detroit, Michigan, in 1983, and a Ph.D. degree with a major in Computer Engineering and a minor in Control Engineering from the University of Michigan, Ann Arbor, Michigan, in 1989. Her current research interests are in the areas of machine learning, pattern recognition, computer vision, and intelligent systems with applications to automated and connected vehicles, optimal vehicle power management, data analytics, and robotic vision systems. She has authored over 130 publications in refereed journals and conference proceedings. She is an editor for the Journal of Pattern Recognition, a senior life member of AAAI, and a fellow of the IEEE.
Kai S. Cortina, PhD, is Professor of Psychology in the College of Literature, Science, and the Arts at the University of Michigan, Ann Arbor.
Prof. Cortina’s major research revolves around understanding children’s and adolescents’ pathways into adulthood and the role of the educational system in this process. Academic and psycho-social development is analyzed from a life-span perspective, relying exclusively on longitudinal data collected over longer periods of time (e.g., from middle school to young adulthood). The hierarchical structure of the school system (student/classroom/school/district/state/nation) requires statistical tools that can handle this kind of nested data.
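As a toy illustration of why nesting matters, the sketch below (hypothetical numbers, NumPy only) simulates students nested in classrooms and estimates the intraclass correlation (ICC), the share of outcome variance attributable to the classroom level, via one-way ANOVA variance components:

```python
import numpy as np

# Toy two-level data: students (level 1) nested in classrooms (level 2).
# Random-intercept model y_ij = mu + u_j + e_ij; the intraclass
# correlation (ICC) is the share of variance at the classroom level.
rng = np.random.default_rng(0)
n_class, n_stud = 50, 20
u = rng.normal(0.0, 1.0, n_class)             # classroom effects (var 1)
e = rng.normal(0.0, 2.0, (n_class, n_stud))   # student-level noise (var 4)
y = 70.0 + u[:, None] + e                     # e.g., test scores

# One-way ANOVA estimators of the variance components
class_means = y.mean(axis=1)
msb = n_stud * class_means.var(ddof=1)                      # between classes
msw = ((y - class_means[:, None]) ** 2).sum() / (n_class * (n_stud - 1))
var_between = (msb - msw) / n_stud
icc = var_between / (var_between + msw)
print(f"estimated ICC = {icc:.2f}")           # true ICC is 1/(1+4) = 0.2
```

A nonzero ICC is exactly what makes ordinary (non-hierarchical) regression standard errors misleading for such data, motivating multilevel models.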
Nils G. Walter, PhD, is the Francis S. Collins Collegiate Professor of Chemistry, Biophysics and Biological Chemistry, College of Literature, Science, and the Arts and Professor of Biological Chemistry, Medical School, at the University of Michigan, Ann Arbor.
Nature and nanotechnology alike employ nanoscale machines that self-assemble into structures of complex architecture and functionality. Fluorescence microscopy offers a non-invasive tool to probe, and ultimately dissect and control, these nanoassemblies in real time. In particular, single-molecule fluorescence resonance energy transfer (smFRET) allows us to measure distances on the 2-8 nm scale, whereas complementary super-resolution localization techniques based on Gaussian fitting of imaged point spread functions (PSFs) measure distances in the 10 nm and longer range. In terms of Big Data analysis, we have developed a method for the intracellular single-molecule, high-resolution localization and counting (iSHiRLoC) of microRNAs (miRNAs), a large group of gene silencers with profound roles in our body, from stem cell development to cancer. Microinjected, singly fluorophore-labeled, functional miRNAs are tracked at super-resolution within individual diffusing particles. Observed changes in mobility and mRNA-dependent assembly suggest the existence of two kinetically distinct assembly processes. We are currently feeding these data into a single-molecule systems biology pipeline to bring into focus the unifying molecular mechanism of this ubiquitous gene regulatory pathway. In addition, we are using cluster analysis of smFRET time traces to show that large RNA processing machines such as single spliceosomes – responsible for the accurate removal of all intervening sequences (introns) in pre-messenger RNAs – work as biased Brownian ratchet machines. On the opposite end of the application spectrum, we utilize smFRET and super-resolution fluorescence microscopy to monitor enhanced enzyme cascades and nanorobots engineered to self-assemble and function on DNA origami.
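The Gaussian PSF fitting that underlies super-resolution localization can be sketched in a few lines; the pixel grid, noise level, and emitter position below are illustrative values, not actual iSHiRLoC data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a 2D Gaussian to a simulated camera image of a single emitter's
# point spread function (PSF); the fitted center localizes the emitter
# to well below the pixel size. Grid size, noise, and the "true" center
# (5.3, 4.7) are illustrative.
def gauss2d(xy, x0, y0, sigma, amp, offset):
    x, y = xy
    return offset + amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

rng = np.random.default_rng(1)
x, y = np.meshgrid(np.arange(11.0), np.arange(11.0))
truth = gauss2d((x, y), 5.3, 4.7, 1.5, 100.0, 10.0)
img = truth + rng.normal(0.0, 2.0, truth.shape)          # camera-noise stand-in

p0 = [5.0, 5.0, 1.0, img.max(), img.min()]               # rough initial guess
popt, _ = curve_fit(gauss2d, (x.ravel(), y.ravel()), img.ravel(), p0=p0)
print(f"fitted center: ({popt[0]:.2f}, {popt[1]:.2f})")
```

The recovered center falls within a small fraction of a pixel of the true position, which is why localization precision can beat the diffraction limit.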
Omid Dehzangi, PhD, is Assistant Professor of Computer and Information Science, College of Engineering and Computer Science, at the University of Michigan, Dearborn.
Wearable health technology is drawing significant attention for good reasons. The pervasive nature of such systems, providing ubiquitous access to continuous personalized data, will transform the way people interact with each other and their environment. The information extracted from these systems will enable emerging applications in healthcare, wellness, emergency response, fitness monitoring, elderly care support, long-term preventive chronic care, assistive care, smart environments, sports, gaming, and entertainment. These applications create many new research opportunities and draw researchers from various disciplines into data science, the methodological umbrella for data collection, data management, data analysis, and data visualization. Despite this ground-breaking potential, a number of interesting challenges remain in designing and developing wearable medical embedded systems. Because resources in wearable processing architectures are limited, power efficiency is required for unobtrusive, long-term operation of the hardware. In addition, the data-intensive nature of continuous health monitoring demands efficient signal processing and data analytic algorithms for real-time, scalable, reliable, accurate, and secure extraction of relevant information from an overwhelmingly large amount of data. Extensive research in the design, development, and assessment of such systems is therefore necessary.

Embedded Processing Platform Design

The majority of my work concentrates on designing wearable embedded processing platforms that shift the conventional paradigm from hospital-centric healthcare, with its episodic and reactive focus on disease, to patient-centric, home-based healthcare. This alternative demands specialized design in terms of hardware, software development, signal processing and uncertainty reduction, data analysis, predictive modeling, and information extraction.
The objective is to reduce costs and improve the effectiveness of healthcare through proactive early monitoring, diagnosis, and treatment of disease (i.e., preventive care), as shown in Figure 1.
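As a minimal sketch of the kind of lightweight, real-time feature extraction such a platform must run continuously, the following estimates heart rate from a synthetic pulse-like signal by counting threshold crossings; the sampling rate, noise level, and 72 bpm rate are assumptions for the sketch:

```python
import numpy as np

# Estimate heart rate from a synthetic pulse-like signal by counting
# rising threshold crossings. Sampling rate, noise level, and the
# 72 bpm "true" rate are assumed values for illustration.
fs = 50.0                                    # sampling rate in Hz (assumed)
t = np.arange(0.0, 30.0, 1.0 / fs)           # a 30 s analysis window
rng = np.random.default_rng(6)
bpm_true = 72.0
pulse = np.maximum(np.sin(2 * np.pi * bpm_true / 60.0 * t), 0.0) ** 4
signal = pulse + 0.02 * rng.normal(size=t.size)

above = signal > 0.5                         # simple fixed threshold
beats = np.count_nonzero(above[1:] & ~above[:-1])   # count rising edges
bpm = beats * 60.0 / 30.0                    # beats per second times 60
print(f"estimated heart rate: {bpm:.0f} bpm")
```

Threshold counting is deliberately cheap; that is the point on a power-constrained wearable, where every multiply competes with battery life.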
One of my research interests is the digital diagnosis of material damage based on sensors, computational science, and numerical analysis with large-scale 3D computed tomography data: (1) establishment of a multi-resolution transformation rule of material defects; (2) design of an accurate digital diagnosis method for material damage; (3) reconstruction of defects in material domains from X-ray CT data; and (4) parallel computation of material damage. My team has also conducted a series of studies on improving the quality of large-scale laser scanning data in reverse engineering and industrial inspection: (1) detection and removal of non-isolated outlier data clusters; (2) accurate correction of surface data noise in polygonal meshes; and (3) denoising of two-dimensional geometric discontinuities.
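Outlier removal for scan data can be illustrated with a simple statistical filter; the synthetic point cloud, neighborhood size k, and 2-sigma cutoff below are assumptions for the sketch, not the specific method of the studies above:

```python
import numpy as np

# Statistical outlier removal for a scanned point cloud: discard points
# whose mean distance to their k nearest neighbors is far above the
# global average. The synthetic "surface", k, and 2-sigma cutoff are
# illustrative choices.
rng = np.random.default_rng(5)
surface = rng.random((500, 3))
surface[:, 2] *= 0.01                        # a thin, densely sampled slab
outliers = rng.random((20, 3)) + np.array([0.0, 0.0, 1.0])  # floating noise
pts = np.vstack([surface, outliers])

k = 8
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
knn_mean = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)      # skip self-distance
keep = knn_mean < knn_mean.mean() + 2.0 * knn_mean.std()
print(f"kept {keep.sum()} of {len(pts)} points")
```

The sparse floating points sit far from their neighbors and are dropped, while the densely sampled surface survives; real scan pipelines replace the brute-force distance matrix with a spatial index.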
Another research focus is information fusion of large-scale data for autonomous driving. Our research is funded by the National Natural Science Foundation of China, with a focus on (1) laser-based perception in degraded visual environments, (2) 3D pattern recognition with dynamic, incomplete, noisy point clouds, (3) real-time image processing algorithms in degraded visual environments, and (4) brain-computer interfaces to predict the state of drivers.
Elizaveta (Liza) Levina and her group work on various questions arising in the statistical analysis of large and complex data, especially networks and graphs. Their current focus is on developing rigorous and computationally efficient statistical inference on realistic models for networks. Current directions include community detection problems in networks (overlapping communities, networks with additional information about the nodes and edges, estimating the number of communities), link prediction (networks with missing or noisy links, networks evolving over time), prediction with data connected by a network (e.g., the role of friendship networks in the spread of risky behaviors among teenagers), and statistical analysis of samples of networks, with applications to brain imaging, especially fMRI data from studies of mental health.
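A minimal sketch of community detection, assuming a two-block stochastic block model and plain spectral partitioning via the Fiedler vector (an illustrative baseline, not a specific method of the group):

```python
import numpy as np

# Spectral community detection on a two-block stochastic block model:
# nodes in the same community connect with probability 0.5, across
# communities with probability 0.05, and the sign pattern of the
# Fiedler vector recovers the planted split. All parameters are toy values.
rng = np.random.default_rng(2)
n = 40
labels = np.repeat([0, 1], n // 2)
p = np.where(labels[:, None] == labels[None, :], 0.5, 0.05)
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T                                  # symmetric adjacency, no self-loops

L = np.diag(A.sum(axis=1)) - A               # unnormalized graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)
pred = (eigvecs[:, 1] > 0).astype(int)       # sign of the Fiedler vector

acc = max(np.mean(pred == labels), np.mean(pred != labels))  # up to label swap
print(f"recovered {acc:.0%} of the community labels")
```

With this much separation between within- and between-block probabilities, the spectral split is essentially exact; the interesting statistical questions arise as the two probabilities approach each other.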
Our lab’s research interests are in the areas of oncology bioinformatics, multimodality image analysis, and treatment outcome modeling. We operate at the interface of physics, biology, and engineering, with the primary motivation of designing and developing novel approaches to unravel cancer patients’ response to chemoradiotherapy. We do this by integrating physical, biological, and imaging information into advanced mathematical models, using combined top-down and bottom-up approaches that apply techniques of machine learning and complex systems analysis to first principles, and by evaluating the models’ performance on clinical and preclinical data. These models could then be used to personalize cancer patients’ chemoradiotherapy based on predicted benefit and risk, and to help understand the underlying biological response to disease. These research interests are divided into the following themes:
- Bioinformatics: design and develop large-scale data mining methods and software tools to identify robust biomarkers (-omics) of chemoradiotherapy treatment outcomes from clinical and preclinical data.
- Multimodality image-guided targeting and adaptive radiotherapy: design and develop hardware tools and software algorithms for multimodality image analysis and understanding, feature extraction for outcome prediction (radiomics), real-time treatment optimization and targeting.
- Radiobiology: design and develop predictive models of tumor and normal tissue response to radiotherapy. Investigate the application of these methods to develop therapeutic interventions that protect against normal tissue toxicities.
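As an illustrative (not lab-specific) example of a radiotherapy outcome model, the sketch below evaluates a standard logistic tumor control probability (TCP) dose-response curve; the parameter values for D50 and gamma50 are hypothetical:

```python
import numpy as np

# A standard logistic tumor control probability (TCP) model: D50 is the
# dose giving 50% control and gamma50 the normalized slope at D50.
# The parameter values here are hypothetical, for illustration only.
def tcp(dose, d50=60.0, gamma50=2.0):
    dose = np.asarray(dose, dtype=float)
    return 1.0 / (1.0 + (d50 / dose) ** (4.0 * gamma50))

for d in (40.0, 60.0, 80.0):
    print(f"{d:.0f} Gy -> TCP {float(tcp(d)):.2f}")
```

In practice, such response curves are fit to cohort data and then enriched with imaging and -omics covariates, which is where the data mining and radiomics themes above come in.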
Professor Balzano and her students investigate problems in statistical signal processing and optimization, particularly dealing with large and messy data. Her applications typically have missing, corrupted, and uncalibrated data as well as heterogeneous data in terms of sensors, sensor quality, and scale in both time and space. Her theoretical interests involve classes of non-convex problems that include Principal Components Analysis (or the Singular Value Decomposition) and many interesting variants such as PCA with sparse or structured principal components, orthogonality and non-negativity constraints, nonlinear variants such as low-dimensional algebraic variety models, and even categorical data or human preference data. She concentrates on fast gradient methods and related optimization methods that are scalable to real-time operation and massive data. Her work provides algorithmic and statistical guarantees for these algorithms on the aforementioned non-convex problems, and she focuses carefully on assumptions that are realistic for the relevant applications. She has worked in the areas of online algorithms, real-time computer vision, compressed sensing and matrix completion, network inference, and sensor networks.
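A toy instance of one such non-convex problem is low-rank matrix completion solved by batch gradient descent on a factorization, a much-simplified cousin of incremental subspace methods such as GROUSE; the dimensions, rank, and step size below are illustrative:

```python
import numpy as np

# Low-rank matrix completion by batch gradient descent on the factors
# of M ~ U V^T, observing only ~50% of the entries. Problem sizes, rank,
# and step size are toy choices for the sketch.
rng = np.random.default_rng(3)
n, m, r = 30, 30, 2
M = rng.normal(size=(n, r)) @ rng.normal(size=(r, m))   # true rank-2 matrix
mask = rng.random((n, m)) < 0.5                         # observed entries

U = rng.normal(scale=0.1, size=(n, r))                  # small random init
V = rng.normal(scale=0.1, size=(m, r))
step = 0.01
for _ in range(2000):
    R = mask * (U @ V.T - M)                # residual on observed entries only
    U, V = U - step * R @ V, V - step * R.T @ U

err = np.linalg.norm(mask * (U @ V.T - M)) / np.linalg.norm(mask * M)
print(f"relative error on observed entries: {err:.3f}")
```

Despite the non-convexity, gradient descent from a small random start reliably finds the planted low-rank structure on benign random instances like this one; the theoretical work referenced above characterizes when and why such guarantees hold.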
Jerome P. Lynch, PhD, is Professor and Donald Malloure Department Chair of Civil and Environmental Engineering in the College of Engineering at the University of Michigan, Ann Arbor.
Prof. Lynch’s group works at the forefront of deploying large-scale sensor networks in the built environment for monitoring and control of civil infrastructure systems, including bridges, roads, rail networks, and pipelines; this research portfolio falls within the broader class of cyber-physical systems (CPS). To maximize the benefit of the massive data sets they collect from operational infrastructure systems, they undertake research in relational and NoSQL database systems, cloud-based analytics, and data visualization technologies. In addition, their algorithmic work focuses on the use of statistical signal processing, pattern classification, machine learning, and model inversion/updating techniques to automate the interrogation of collected sensor data. The ultimate aim of Prof. Lynch’s work is to harness the full potential of data science to provide system users with real-time, actionable information derived from raw sensor data.
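One elementary building block of such automated interrogation is estimating a structure's dominant vibration frequency from accelerometer data, since a persistent shift in a natural frequency can flag stiffness loss; the sketch below (simulated signal, assumed sampling rate and mode frequency) does this with an FFT:

```python
import numpy as np

# Estimate a structure's dominant natural frequency from accelerometer
# data with an FFT. The 100 Hz sampling rate and 2.4 Hz mode are
# assumed values for a simulated bridge-like signal.
fs = 100.0                                   # sampling rate in Hz (assumed)
t = np.arange(0.0, 60.0, 1.0 / fs)           # one minute of data
rng = np.random.default_rng(4)
accel = np.sin(2 * np.pi * 2.4 * t) + 0.5 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak = freqs[spectrum[1:].argmax() + 1]      # skip the DC bin
print(f"dominant frequency: {peak:.2f} Hz")
```

Running this kind of lightweight spectral analysis near the sensor, and streaming only the extracted features, is one way such networks turn raw acceleration records into actionable condition information.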