Entropy-Based Ensemble Prediction Schemes

Greg Eyink
Johns Hopkins University

Statistical distributions of geophysical systems are often non-Gaussian (multi-modal, skewed, fat-tailed, etc.). However, many conventional ensemble prediction schemes based on Bayesian methods,
such as the Ensemble Kalman Filter, assume normal statistics when conditioning upon observations, and this can lead to poor performance.
On the other hand, systematically converging ensemble schemes may require very large numbers of samples to properly represent non-Gaussian statistics, more than can be practically obtained for large-scale geophysical models. I shall introduce ensemble prediction methods that model system statistics as maximum-entropy/minimum-information distributions, which can be quite non-Gaussian. The basic idea is to represent system statistics by a distribution which, subject to moment constraints, minimizes the relative entropy or Kullback-Leibler information with respect to a model of the prior statistics. When the latter is taken to be a mixture of Gaussians,
this scheme gives a natural generalization of the Ensemble Kalman Filter for non-normal statistics. I'll discuss the method, its implementation, and its advantages and disadvantages. Finally, I'll
present results for a simple stochastic PDE model of the ocean thermohaline circulation, which has bimodal statistics associated with two distinct stable states, and compare the cost and performance of the maximum-entropy method with those of other standard ensemble filtering methods.
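To make the minimum relative-entropy idea concrete, here is a small numerical sketch, not the speaker's actual implementation: in one dimension, with a standard-normal prior q and a single mean constraint E_p[x] = m, minimizing the Kullback-Leibler divergence KL(p||q) subject to that constraint yields an exponentially tilted density p(x) ∝ q(x) exp(λx), whose Lagrange multiplier λ can be found by a simple root search. All names and the choice of a 1-D Gaussian prior are illustrative assumptions.

```python
import numpy as np

# Hypothetical 1-D illustration of a minimum relative-entropy fit:
# minimize KL(p || q) subject to the moment constraint E_p[x] = target.
# The minimizer is the exponential tilt p(x) ∝ q(x) * exp(lam * x);
# we solve for the multiplier lam on a grid by bisection.

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
q = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)  # prior: standard normal

def tilted_mean(lam):
    """Mean of the tilted density p(x) ∝ q(x) exp(lam * x) on the grid."""
    w = q * np.exp(lam * x)
    w /= w.sum() * dx          # normalize to a probability density
    return (w * x).sum() * dx

target = 1.5                   # moment constraint: desired mean
lo, hi = -10.0, 10.0
for _ in range(100):           # bisection; lam -> tilted_mean(lam) is monotone
    mid = 0.5 * (lo + hi)
    if tilted_mean(mid) < target:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)

# For a standard-normal prior the tilt is exactly N(lam, 1),
# so the multiplier should coincide with the target mean.
print(lam)  # ≈ 1.5
```

For this Gaussian prior the tilted density is again Gaussian, so the answer is known in closed form; the point of the numerical root search is that the same moment-matching step carries over when the prior is a Gaussian mixture, where no closed form is available.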


Back to Mathematical Issues and Challenges in Data Assimilation for Geophysical Systems: Interdisciplinary Perspectives