In this talk we address the following basic problem. When computing a quantity of interest (or observable, performance measure, etc.) with a (design) probability model P, one should quantify the uncertainty in that quantity if we only assume that the true (but unattainable) model Q belongs to some neighborhood of the design model P. In this context we discuss new information inequalities based on the variational representation of the relative entropy (a.k.a. Kullback-Leibler divergence) and of the Rényi divergences. Compared to classical information inequalities (a prototype being the Csiszár-Kullback-Pinsker inequality), the new information inequalities are tight, and they scale with time (for stochastic processes) and/or with space (for translation-invariant interacting particle systems). These scaling properties allow us to derive steady-state uncertainty quantification bounds (even in the absence of detailed balance), as well as uncertainty quantification bounds for the phase diagrams of statistical mechanics models. In doing so we uncover a deep connection between uncertainty quantification and the thermodynamic formalism: it turns out that to understand uncertainty quantification one needs to control risk-sensitive functionals (of free-energy type) of the model P. We also discuss how to make the bounds computable by using concentration inequalities. The Rényi divergence comes into play when considering rare events, and we also derive information bounds for probabilities of rare events based on a new variational principle due to Paul Dupuis and collaborators.
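As a sketch of the kind of inequality the abstract refers to (the precise statements are in the talk; the notation R(Q‖P) for the relative entropy and the specific form below are assumptions on my part, following the standard Gibbs variational formula), the variational representation of relative entropy,
\[
\log \mathbb{E}_P\!\left[e^{g}\right] \;=\; \sup_{Q} \left\{ \mathbb{E}_Q[g] - R(Q\,\|\,P) \right\},
\]
yields, for a quantity of interest \(f\), a tight two-sided bound of the form
\[
\pm\left(\mathbb{E}_Q[f] - \mathbb{E}_P[f]\right) \;\le\; \inf_{c>0} \left\{ \frac{1}{c}\,\log \mathbb{E}_P\!\left[e^{\pm c\,(f - \mathbb{E}_P[f])}\right] + \frac{1}{c}\,R(Q\,\|\,P) \right\}.
\]
The term \(\log \mathbb{E}_P[e^{\pm c (f-\mathbb{E}_P[f])}]\) is exactly the risk-sensitive (free-energy-type) functional of the model P mentioned in the abstract.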
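A minimal numerical sketch of the bound above, under assumptions of my own choosing: P = N(0,1), Q = N(μ,1), and f(x) = x, for which the relative entropy and the cumulant generating function have closed forms, so no sampling is needed. The variable names (`kl`, `bound`) are illustrative, not from the talk.

```python
import numpy as np

# Toy check of the goal-oriented UQ bound
#   E_Q[f] - E_P[f] <= inf_{c>0} [ (1/c) log E_P[exp(c (f - E_P f))] + KL(Q||P)/c ]
# with P = N(0,1), Q = N(mu,1), f(x) = x.

mu = 0.7                 # mean shift of the "true" model Q (assumed for the demo)
kl = mu**2 / 2           # closed-form KL(Q||P) for two unit-variance Gaussians
true_gap = mu            # E_Q[f] - E_P[f] = mu in this example

# Centered cumulant generating function of f under P:
# Lambda(c) = log E_P[exp(c x)] = c^2 / 2 (closed form for a standard Gaussian).
cs = np.linspace(1e-3, 5.0, 100_000)
bound = np.min(cs / 2 + kl / cs)   # (1/c) Lambda(c) + KL/c, minimized over a grid

print(f"true gap = {true_gap:.4f}, bound = {bound:.4f}")
```

For a Gaussian mean shift the infimum is attained at c = μ and the bound equals the true gap, illustrating the tightness claimed in the abstract (in contrast to Pinsker-type bounds).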
Workshop IV: Uncertainty Quantification for Stochastic Systems and Applications