The lectures will focus on variational problems that arise in machine learning. Modern data-acquisition techniques produce a wealth of data about our world. Extracting information from these data leads to machine learning tasks such as clustering, classification, regression, dimensionality reduction, and others. These tasks are often modeled via functionals, defined on the available random sample, which specify the desired properties of the object sought.
The lectures will discuss a mathematical framework suitable for studying the asymptotic properties of such variational problems posed on random samples and related random geometries (e.g. proximity graphs). In particular, we will discuss the passage from discrete variational problems based on random samples, as well as some PDEs on graphs, to continuum limits.
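To make the discrete side of this passage concrete, the following sketch builds a proximity graph on a random sample and evaluates a graph Dirichlet energy on it. This is an illustration only, not part of the lectures: the epsilon-neighborhood construction, the specific scaling by n^2 * eps^2, and the function names are assumptions chosen for simplicity.

```python
import math
import random

def proximity_graph(points, eps):
    """Edges of the eps-neighborhood graph: connect points at distance < eps."""
    n = len(points)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < eps:
                edges.append((i, j))
    return edges

def dirichlet_energy(points, u, eps):
    """Discrete Dirichlet energy of u on the eps-graph.
    The 1 / (n^2 * eps^2) scaling is one common normalization
    under which such energies admit continuum limits."""
    n = len(points)
    edges = proximity_graph(points, eps)
    return sum((u[i] - u[j]) ** 2 for i, j in edges) / (n * n * eps * eps)

# A random sample from the unit square and a test function u(x, y) = x.
random.seed(0)
pts = [(random.random(), random.random()) for _ in range(200)]
u = [x for x, _ in pts]
energy = dirichlet_energy(pts, u, eps=0.15)
```

As the sample size grows and eps shrinks at a suitable rate, energies of this form are the discrete objects whose continuum limits the lectures analyze.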
The lectures will introduce the basic elements of the background material on the calculus of variations and optimal transportation, and establish the asymptotic consistency of several important machine learning algorithms.
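As a small taste of the optimal transportation background, the sketch below computes the p-Wasserstein distance between two empirical measures on the real line, using the fact that in one dimension the optimal coupling matches sorted samples. This example is an assumption-laden illustration (equal sample sizes, one dimension), not a method from the lectures.

```python
import random

def wasserstein_1d(xs, ys, p=2):
    """p-Wasserstein distance between two equal-size empirical measures on R.
    In 1-D the optimal transport plan pairs the i-th smallest x with the
    i-th smallest y, so the distance reduces to a sorted-sample average."""
    assert len(xs) == len(ys)
    n = len(xs)
    xs, ys = sorted(xs), sorted(ys)
    return (sum(abs(a - b) ** p for a, b in zip(xs, ys)) / n) ** (1.0 / p)

# Two Gaussian samples whose means differ by 0.5; the population
# 2-Wasserstein distance between N(0,1) and N(0.5,1) is 0.5.
random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(500)]
ys = [random.gauss(0.5, 1.0) for _ in range(500)]
d = wasserstein_1d(xs, ys)
```

The convergence of such empirical transport distances to their population counterparts is one instance of the consistency questions the lectures address.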