Methods for scalable probabilistic inference

Dan Foreman-Mackey
Flatiron Institute

Most data analysis pipelines in astrophysics now include steps that require detailed probabilistic modeling. As datasets grow larger and our research questions become more ambitious, we are often pushing the limits of what our statistical frameworks can handle. In this talk, I will discuss recent (and not so recent) developments in the field of probabilistic programming that enable rigorous Bayesian inference with large datasets and with high-dimensional or computationally expensive models. In particular, I will highlight some scalable methods for time series analysis using Gaussian processes, as well as open source tools and computational techniques that have the potential to be broadly useful for accelerating inference in astrophysics.
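As a concrete illustration of the kind of scaling at stake (not code from the talk itself): a naive Gaussian-process log-likelihood costs O(N^3) via a Cholesky factorization of the full covariance matrix, but for special kernels the same quantity can be computed in O(N). The sketch below shows this for the exponential (Ornstein-Uhlenbeck) kernel, whose Markov structure lets the joint density factorize into one-step conditionals; the function names and parameters are illustrative, not from the presentation.

```python
import numpy as np

def loglike_dense(t, y, sigma, ell):
    """Naive O(N^3) GP log-likelihood with an exponential kernel
    k(dt) = sigma^2 * exp(-|dt| / ell), zero mean, no extra noise."""
    K = sigma**2 * np.exp(-np.abs(t[:, None] - t[None, :]) / ell)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, y)  # alpha = L^{-1} y, so alpha.alpha = y^T K^{-1} y
    n = len(t)
    return -0.5 * alpha @ alpha - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2 * np.pi)

def loglike_markov(t, y, sigma, ell):
    """O(N) version: the exponential kernel is Markov, so
    p(y) = N(y_1; 0, sigma^2) * prod_i N(y_i; phi_i y_{i-1}, sigma^2 (1 - phi_i^2))
    with phi_i = exp(-(t_i - t_{i-1}) / ell)."""
    ll = -0.5 * (y[0] ** 2 / sigma**2 + np.log(2 * np.pi * sigma**2))
    phi = np.exp(-np.diff(t) / ell)
    var = sigma**2 * (1.0 - phi**2)
    resid = y[1:] - phi * y[:-1]
    ll += -0.5 * np.sum(resid**2 / var + np.log(2 * np.pi * var))
    return ll

# Both routines evaluate the same density, at very different cost:
rng = np.random.default_rng(0)
t = np.cumsum(rng.uniform(0.05, 0.5, size=200))  # irregularly sampled times
y = rng.normal(size=t.size)
print(np.isclose(loglike_dense(t, y, 1.0, 1.0), loglike_markov(t, y, 1.0, 1.0)))
```

Libraries such as celerite generalize this idea beyond the pure exponential kernel, keeping linear scaling for a flexible family of one-dimensional kernels.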

Presentation (PDF File)

Workshop III: Source Inference and Parameter Estimation in Gravitational Wave Astronomy