My current research is in the area of rational approximation in the complex domain. For example, I investigate the convergence of rational function series on the extended complex plane.
Prof. Zhang develops algebraic and geometric methods for data analysis. The algebraic methods are based on theories of topology and partially ordered sets (in particular lattice theory); an example is formal concept analysis (FCA). The geometric methods include information geometry, which studies the manifold of probability density functions. Zhang also has interests in machine learning, including reproducing kernel Banach spaces (RKBS) and reinforcement learning (RL).
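To illustrate the lattice-theoretic flavor of FCA mentioned above, here is a minimal sketch that enumerates the formal concepts of a toy object-attribute context. The context (animals and two attributes) is entirely hypothetical, and the brute-force enumeration is only meant to show the definition: a concept is a pair (extent, intent) that is closed under the two derivation operators.

```python
from itertools import combinations

# Hypothetical toy formal context: each object maps to its set of attributes.
context = {
    "frog": {"aquatic", "terrestrial"},
    "fish": {"aquatic"},
    "dog":  {"terrestrial"},
}
attributes = set().union(*context.values())

def intent(objs):
    """Attributes shared by every object in objs (all attributes if objs is empty)."""
    sets = [context[o] for o in objs]
    return set.intersection(*sets) if sets else set(attributes)

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {o for o, a in context.items() if attrs <= a}

# A formal concept is a pair (A, B) with A = extent(B) and B = intent(A).
# Closing every subset of objects finds them all (fine for a toy context).
concepts = set()
objs = list(context)
for r in range(len(objs) + 1):
    for combo in combinations(objs, r):
        b = intent(set(combo))
        a = extent(b)
        concepts.add((frozenset(a), frozenset(b)))

for a, b in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(a), "<->", sorted(b))
```

Ordered by inclusion of extents, the resulting concepts form the concept lattice that FCA studies.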
Mahesh Agarwal is Associate Professor of Mathematics and Statistics at the University of Michigan-Dearborn.
Prof. Agarwal is primarily interested in number theory, in particular p-adic L-functions, the Bloch-Kato conjecture, and automorphic forms. His secondary research interests are polynomials, geometry, and math education.
My interests include randomized approximation algorithms for massive data sets, in particular sublinear-time algorithms for sparse recovery in the Fourier and other domains. Other interests include data privacy, including the privacy of energy usage data.
Prof. Vershynin’s main area of expertise is high-dimensional probability and its applications. He is interested in random geometric structures that appear in various data science problems. The following is a sample of his recent projects:

1. High-dimensional inference from nonlinear data. Sometimes we are given certain observations of an unknown vector that encodes useful but hidden information, and we want to compute that vector. Examples include compressed sensing, linear and nonlinear regression, as well as binary (yes-no) observations. We are developing methods that can estimate the hidden vector without even knowing the nature of the nonlinearity of the observations. Areas of application include survey methodologies, signal processing, and various high-dimensional classification problems.

2. Structure mining in networks. Complex data sets such as networks often have latent structures, for example clusters or communities. We are interested in developing efficient methods to discover such latent structures.

Prof. Vershynin’s methods come from various areas of mathematics and data science, including random matrix theory, geometric functional analysis, convex and discrete geometry, geometric combinatorics, high-dimensional statistics, information theory, learning theory, signal processing, theoretical computer science, and numerical analysis.
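A minimal sketch of the idea behind project 1, for the one-bit case: we observe only the signs of Gaussian linear measurements of a hidden unit vector, yet a simple linear estimator recovers its direction without using any knowledge of the sign nonlinearity. The dimensions, seed, and estimator below are illustrative, not a description of Prof. Vershynin's actual algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: n-dimensional hidden vector, m nonlinear observations.
n, m = 50, 5000

# Unknown unit vector we wish to estimate.
x = rng.normal(size=n)
x /= np.linalg.norm(x)

# Gaussian measurement matrix; we only see binary (yes-no) observations,
# i.e. the signs of the linear measurements A @ x.
A = rng.normal(size=(m, n))
y = np.sign(A @ x)

# Linear estimator: average the measurement rows weighted by the observed
# signs. For Gaussian A this is proportional to x in expectation, and the
# same estimator works for many other unknown nonlinearities in place of sign.
x_hat = A.T @ y / m
x_hat /= np.linalg.norm(x_hat)

print("correlation with truth:", np.dot(x_hat, x))
```

With m much larger than n, the correlation between the estimate and the hidden vector is close to 1, even though the estimator never used the form of the nonlinearity.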
My research interests include mathematical analysis, probability, networking, and algorithms. I am especially interested in randomized algorithms with applications to harmonic analysis, signal and image processing, computer networking, and massive datasets.