Probabilistic operator learning: generative modeling and uncertainty quantification for in-context operator learning

Benjamin Zhang
Brown University

In-context operator networks (ICON) are a class of non-intrusive operator learning methods built on the architectures of foundation models. ICON is trained on datasets consisting of initial and boundary conditions and the corresponding solutions for a variety of ordinary differential equations (ODEs) and partial differential equations (PDEs). The trained network acts as an operator learner: given new example initial/boundary condition-solution pairs, it infers the solution operator underlying those examples, and the inferred operator then maps new initial/boundary conditions to the corresponding solutions. In contrast to traditional operator learning methods, ICON can approximate solutions of differential equations whose condition-solution pairs do not appear explicitly in the training data. Motivated by the capabilities of ICON and related foundation models for ODEs and PDEs, we establish a probabilistic framework for understanding ICON based on random differential equations. We show that ICON implicitly performs Bayesian inference: it characterizes the mean of the posterior predictive distribution over solution operators, that is, the distribution of solution operators conditioned on the example condition-solution pairs. This framework provides a basis for extending ICON to generative settings, in which we can sample from the posterior predictive. The probabilistic formulation enables uncertainty quantification for operator learning, providing confidence in predicted solutions.
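The posterior predictive mean described above can be illustrated with a minimal toy example (not the ICON architecture itself, just the probabilistic target it is argued to approximate). Here the "random differential equation" is dx/dt = a*x with an unknown parameter a drawn from a prior; condition-solution pairs (x0, x(T)) play the role of in-context examples, and a grid-based Bayesian update yields the posterior predictive mean of the solution operator applied to a new initial condition. All quantities (prior, noise level, time horizon) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy random ODE: dx/dt = a * x, so x(T) = x0 * exp(a*T).
# The operator parameter a is hidden from the learner.
T = 1.0
sigma = 0.05      # assumed observation noise on the solution
a_true = 0.7      # ground-truth parameter, used only to simulate data

# In-context examples: condition-solution pairs (x0, x(T)) with noise
x0_examples = rng.uniform(0.5, 1.5, size=5)
xT_examples = x0_examples * np.exp(a_true * T) + sigma * rng.normal(size=5)

# Grid-based Bayesian inference over the operator parameter a
a_grid = np.linspace(-2.0, 2.0, 2001)
log_prior = -0.5 * a_grid**2  # standard normal prior on a
residuals = xT_examples[:, None] - x0_examples[:, None] * np.exp(a_grid[None, :] * T)
log_lik = -0.5 * np.sum((residuals / sigma) ** 2, axis=0)
log_post = log_prior + log_lik
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Posterior predictive mean (and spread) of the solution operator
# applied to a new initial condition
x0_new = 2.0
solutions = x0_new * np.exp(a_grid * T)
xT_pred_mean = np.sum(post * solutions)
xT_pred_std = np.sqrt(np.sum(post * (solutions - xT_pred_mean) ** 2))

print(xT_pred_mean, x0_new * np.exp(a_true * T))  # close when the examples are informative
```

The predictive standard deviation is the uncertainty-quantification counterpart: it shrinks as more informative condition-solution pairs are supplied, mirroring the confidence estimates the generative extension of ICON would provide.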

