Probabilistic operator learning: generative modeling and uncertainty quantification for in-context operator learning

Benjamin Zhang
Brown University
Division of Applied Mathematics

In-context operator networks (ICON) are a class of non-intrusive operator learning methods built on the architectures underlying foundation models. Trained on diverse datasets of initial and boundary conditions paired with corresponding solutions of ordinary and partial differential equations (ODEs and PDEs), ICON learns to map example condition-solution pairs of a given differential equation to an approximation of its solution operator. In this talk, we present a probabilistic framework that reveals ICON as implicitly performing Bayesian inference: it computes the mean of the posterior predictive distribution over solution operators conditioned on the provided context (example condition-solution pairs). By modeling the dependence among example pairs through random differential equations, we formalize how, given example condition-solution pairs, ICON's output is a point estimate of the posterior predictive distribution over operators. This probabilistic perspective provides a basis for extending ICON to generative settings, where one can sample from the posterior predictive distribution over solution operators. As a result, ICON is no longer limited to point prediction: it can capture the underlying uncertainty in the solution operator. This enables principled uncertainty quantification in operator learning, using generative modeling to produce confidence intervals for predicted solutions.
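As a minimal illustration of the last step, the sketch below shows how posterior predictive samples translate into pointwise confidence intervals. The sampler here is a hypothetical stand-in (a random-amplitude sine, not the actual generative ICON model); only the Monte Carlo mean-and-quantile recipe reflects the procedure described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_operator(rng):
    # Hypothetical stand-in for one draw from the posterior predictive
    # distribution over solution operators, conditioned on the context.
    # A generative ICON would produce such samples; here each sampled
    # "operator" maps x to a * sin(pi x) with random amplitude a.
    a = 1.0 + 0.1 * rng.standard_normal()
    return lambda x: a * np.sin(np.pi * x)

# Evaluate many sampled operators on a query grid.
x = np.linspace(0.0, 1.0, 101)
samples = np.stack([sample_operator(rng)(x) for _ in range(500)])

# Point prediction: the posterior predictive mean (what standard ICON returns).
mean = samples.mean(axis=0)

# Uncertainty quantification: a 95% pointwise confidence band from quantiles.
lo, hi = np.quantile(samples, [0.025, 0.975], axis=0)
```

The band `[lo, hi]` brackets the mean prediction at each query point, giving the kind of predictive confidence interval the generative extension provides.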

Presentation (PDF File)
View on YouTube
