Math 288: Topics in Probability Theory - RANDOM WALKS, RANDOM MATRICES
Instructor: M. Maggioni

We will discuss topics at the intersection of random matrix theory,
graph theory, harmonic analysis and machine learning. The motivation will be
to study sets lying in high-dimensional spaces, but possibly of low
intrinsic dimension. Data sets from a variety of applications, ranging from
text documents to images to financial transactions, may in many
circumstances be modeled as sampled from sets in high dimensions. When these
sets have certain geometric properties, such as low intrinsic dimension or
manifold or manifold-like structure, these properties may be exploited to
perform machine learning tasks, such as predicting the topic of a document,
recognizing a face in a picture, and so on. We will explore the
connections between geometry, sampling (with noise), and approximation of
functions on the data. Random matrices arise naturally, for example as
covariance matrices of noisy data samples, and we will discuss some
non-asymptotic random matrix theory and apply it to the problems of
estimating covariances and the intrinsic dimension of data sets. Graph
theory has also recently come to play a fundamental role in modeling the
geometry of data sets, and we will explore the connections between random
walks on graphs and the geometric properties of point clouds, via
Laplacians, eigenfunctions, and heat kernels.
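As a toy illustration of the first theme (a sketch of my own, not course material): the eigenvalue spectrum of the empirical covariance of noisy samples lying near a low-dimensional linear subspace separates into a few large "signal" eigenvalues and a small noise bulk, which suggests a simple spectral estimate of intrinsic dimension. All parameters here (n, D, d, sigma, and the threshold) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the course): n samples in ambient
# dimension D, lying near a d-dimensional linear subspace, plus Gaussian noise.
n, D, d, sigma = 2000, 100, 5, 0.1

basis, _ = np.linalg.qr(rng.standard_normal((D, d)))  # orthonormal subspace basis
clean = rng.standard_normal((n, d)) @ basis.T         # points on the subspace
noisy = clean + sigma * rng.standard_normal((n, D))   # ambient noise

# Empirical covariance (a random matrix) and its eigenvalue spectrum.
centered = noisy - noisy.mean(axis=0)
cov = centered.T @ centered / n
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]

# The top d eigenvalues (signal, near 1) sit well above the noise bulk
# (near sigma**2), so thresholding the spectrum estimates the intrinsic
# dimension; the factor 5 is an ad hoc choice for this toy setup.
est_dim = int(np.sum(eigvals > 5 * sigma**2))
print("estimated intrinsic dimension:", est_dim)
```

Non-asymptotic random matrix theory makes this heuristic quantitative, by controlling how far the noise eigenvalues can spread for finite n and D.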
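In the same hedged spirit, a minimal sketch of the second theme: heat-kernel weights on a point cloud define a random walk on a graph, and the eigenvectors of its transition matrix (diffusion-map coordinates) reflect the cloud's geometry. The circle example and all parameters are illustrative choices of mine.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative point cloud (my choice): noisy samples on a circle in R^2.
n = 300
theta = np.sort(rng.uniform(0.0, 2.0 * np.pi, n))
pts = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.standard_normal((n, 2))

# Heat-kernel (Gaussian) weights between points; eps is a bandwidth choice.
eps = 0.1
sq_dists = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
W = np.exp(-sq_dists / eps)

# Row-normalizing W gives the transition matrix P of a random walk on the
# weighted graph; I - P is one normalization of the graph Laplacian.
P = W / W.sum(axis=1, keepdims=True)

# P is similar to a symmetric matrix, so its eigenvalues are real, the top
# one is 1 (constant eigenvector), and the next eigenvectors serve as
# diffusion-map coordinates that recover the circle's geometry.
eigvals, eigvecs = np.linalg.eig(P)
order = np.argsort(-eigvals.real)
phi1 = eigvecs[:, order[1]].real
phi2 = eigvecs[:, order[2]].real
recovered_angle = np.arctan2(phi2, phi1)  # a circular coordinate on the data
```

The point of the construction is that the random walk, and hence the Laplacian eigenvectors and heat kernel built from it, depend only on pairwise distances, yet encode the intrinsic geometry of the point cloud.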