Projected Stein variational methods for high-dimensional Bayesian inversion constrained by large-scale PDEs

Peng Chen
University of Texas at Austin

In this talk, I will present two projected Stein variational methods, projected Stein variational gradient descent (pSVGD) and projected Stein variational Newton (pSVN), for solving high-dimensional Bayesian inverse problems. To address the curse of dimensionality, we exploit the intrinsic low-dimensional geometric structure of the posterior distribution in the high-dimensional parameter space via the gradient or Hessian of the log-posterior, and perform a parallel update of the parameter samples projected into a low-dimensional subspace by an SVGD or SVN method. The subspace is adaptively constructed from the eigenvectors of the Fisher or Hessian matrix averaged over the current samples. I will present error bounds for the projected posterior distribution measured in Kullback-Leibler divergence. Numerical experiments will demonstrate the fast convergence of the proposed methods, complexity independent of the parameter dimension and the number of samples, strong parallel scalability in the number of processor cores, and weak scalability with respect to the data dimension. Time permitting, I will also discuss an integration of the Stein variational methods with an adaptive, goal-oriented reduced basis method to solve Bayesian inverse problems constrained by large-scale PDEs.
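To make the projection step concrete, the following is a minimal, hypothetical Python sketch of one pSVGD-style iteration on a toy Gaussian posterior. The names (psvgd_step, grad_log_post, rbf_kernel), the bandwidth heuristic, and the toy problem are illustrative assumptions for exposition, not the speaker's implementation.

import numpy as np

def rbf_kernel(W, h=None):
    # RBF kernel matrix and its gradients for projected samples W (N x r)
    sq = np.sum((W[:, None, :] - W[None, :, :]) ** 2, axis=-1)
    if h is None:  # median heuristic for the bandwidth (a common, assumed choice)
        h = np.median(sq) / np.log(W.shape[0] + 1) + 1e-12
    K = np.exp(-sq / h)
    # grad_K[m, n, :] = gradient of K(W[m], W[n]) with respect to W[m]
    grad_K = -2.0 / h * (W[:, None, :] - W[None, :, :]) * K[:, :, None]
    return K, grad_K

def psvgd_step(X, grad_log_post, rank=5, step=1e-2):
    # One projected SVGD update: build the dominant subspace from a
    # gradient-based (Fisher-type) matrix averaged over the samples,
    # run the SVGD update in that subspace, then lift back.
    N, d = X.shape
    G = np.array([grad_log_post(x) for x in X])   # N x d log-posterior gradients
    H = G.T @ G / N                               # averaged outer-product (Fisher-type) matrix
    eigval, eigvec = np.linalg.eigh(H)
    Psi = eigvec[:, -rank:]                       # top-r eigenvectors, d x r
    W = X @ Psi                                   # projected samples, N x r
    Gw = G @ Psi                                  # projected gradients
    K, grad_K = rbf_kernel(W)
    # SVGD direction in the r-dimensional coefficient space
    phi = (K @ Gw + grad_K.sum(axis=0)) / N
    W_new = W + step * phi
    # lift back, keeping each sample's complement component fixed
    return X + (W_new - W) @ Psi.T

# Toy check: Gaussian posterior in d = 100 with a few strongly data-informed directions
d, N = 100, 64
rng = np.random.default_rng(0)
prec = np.eye(d)
prec[:5, :5] += 50 * np.eye(5)                    # informed directions have large precision
grad_log_post = lambda x: -prec @ x               # gradient of log N(0, prec^{-1})
X = rng.normal(size=(N, d))
for _ in range(200):
    X = psvgd_step(X, grad_log_post, rank=5, step=5e-3)
print("sample mean (first 5 coords):", X.mean(axis=0)[:5])

In this sketch only the coefficients in the informed subspace are moved by the SVGD update, which is the source of the dimension-independent complexity mentioned above; pSVN would additionally use (projected) Hessian information in the update.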
