Hamilton-Jacobi Equations, Mean-Field Games, and Optimal Control for Robust Machine Learning

Markos Katsoulakis
University of Massachusetts Amherst
Mathematics & Statistics

This talk explores the versatility of Hamilton-Jacobi (HJ) equations and mean-field games (MFGs) as a unifying framework for analyzing, designing, and improving the robustness of generative models. We show how major classes of flow- and diffusion-based models, including continuous-time normalizing flows, score-based models, and Wasserstein gradient flows, naturally arise from MFGs under varying particle dynamics and cost functions. The forward-backward PDE structure of MFGs enables the development of faster, data-efficient algorithms and provides new analytical tools for understanding robustness.
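For orientation, a minimal sketch of this forward-backward structure, written in standard (assumed) first-order MFG notation rather than the talk's own, couples a backward Hamilton-Jacobi-Bellman equation for a value function $\phi$ with a forward continuity equation for the density $\rho$:

\[
\begin{aligned}
-\partial_t \phi + H(x,\nabla\phi) &= F(x,\rho), &\qquad \phi(x,T) &= G\big(x,\rho(\cdot,T)\big),\\
\partial_t \rho - \nabla\!\cdot\!\big(\rho\,\nabla_p H(x,\nabla\phi)\big) &= 0, &\qquad \rho(x,0) &= \rho_0(x).
\end{aligned}
\]

Different choices of the Hamiltonian $H$, coupling $F$, and terminal cost $G$ correspond to the different particle dynamics and cost functions mentioned above.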

A key focus of this talk is uncertainty quantification (UQ). We prove a Wasserstein uncertainty propagation theorem showing that score-based generative models are robust to multiple sources of error, including discretization, score estimation, and model-form uncertainty. We also present a parallel framework for Transformer architectures, formulating training as an optimal control problem in which network depth plays the role of time and the training loss serves as the terminal cost. This leads to OT-Transformer, a plug-and-play architecture that incorporates optimal transport regularization and yields provable improvements in generalization, robustness, and efficiency.
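As a purely illustrative sketch of this optimal-control viewpoint (not the OT-Transformer implementation from the talk), the snippet below treats a stack of residual transformer blocks as explicit Euler steps of an ODE in depth and adds a transport-cost (kinetic-energy) penalty to the terminal training loss; the names `ContinuousTransformer` and `ot_weight` are assumptions for the example.

```python
# Minimal sketch: residual transformer blocks as Euler steps x_{t+1} = x_t + h * f(x_t),
# with an optimal-transport-style penalty sum_t h * ||f(x_t)||^2 added to the loss.
import torch
import torch.nn as nn

class ContinuousTransformer(nn.Module):
    def __init__(self, dim, depth, heads=4, step=0.1):
        super().__init__()
        self.step = step
        self.blocks = nn.ModuleList(
            [nn.TransformerEncoderLayer(dim, heads, batch_first=True) for _ in range(depth)]
        )

    def forward(self, x):
        transport_cost = 0.0
        for block in self.blocks:
            velocity = block(x) - x          # f(x_t): displacement produced by the block
            x = x + self.step * velocity     # explicit Euler step in "depth time"
            transport_cost = transport_cost + self.step * velocity.pow(2).mean()
        return x, transport_cost

# Usage: terminal cost plus transport-cost regularization.
model = ContinuousTransformer(dim=64, depth=6)
x = torch.randn(8, 16, 64)                   # (batch, tokens, dim)
out, transport_cost = model(x)
task_loss = out.mean()                       # placeholder terminal cost
ot_weight = 0.01                             # assumed regularization weight
loss = task_loss + ot_weight * transport_cost
loss.backward()
```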

These results are grounded in the regularity theory of HJ equations, which provides theoretical guarantees and practical insight into the stability of both generative models and Transformer architectures.


