A primer on normalizing flows

Laurent Dinh
Google

Normalizing flows are a flexible family of probability distributions that can serve as generative models for a variety of data modalities. Because flows can be expressed as compositions of expressive functions, they have successfully harnessed recent advances in deep learning. An ongoing challenge in developing these methods is the definition of expressive yet tractable building blocks. In this talk, I will introduce the fundamentals and describe recent work (including my own) on this topic.
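To make the "expressive yet tractable building block" idea concrete, here is a minimal sketch (my illustration, not material from the talk) of one affine coupling layer, the construction popularized by NICE and RealNVP. The input is split in two; one half is transformed by a scale and shift computed from the other half, so the Jacobian is triangular and its log-determinant is just the sum of the log-scales, making the change-of-variables log-likelihood log p(x) = log p_Z(f(x)) + log |det df/dx| cheap to evaluate. The weight matrices w_s and w_t stand in for arbitrary neural networks.

```python
import numpy as np

rng = np.random.default_rng(0)

def coupling_forward(x, w_s, w_t):
    """Map x -> z through one affine coupling layer; return z and log|det J|."""
    x1, x2 = np.split(x, 2, axis=-1)
    s = np.tanh(x1 @ w_s)          # log-scale, squashed for numerical stability
    t = x1 @ w_t                   # shift
    z2 = x2 * np.exp(s) + t        # transform only the second half
    z = np.concatenate([x1, z2], axis=-1)
    log_det = s.sum(axis=-1)       # triangular Jacobian: log-det is a cheap sum
    return z, log_det

def log_likelihood(x, w_s, w_t):
    """Change of variables: log p(x) = log N(z; 0, I) + log|det dz/dx|."""
    z, log_det = coupling_forward(x, w_s, w_t)
    log_base = -0.5 * (z**2 + np.log(2 * np.pi)).sum(axis=-1)
    return log_base + log_det

d = 4
w_s = 0.1 * rng.standard_normal((d // 2, d // 2))
w_t = 0.1 * rng.standard_normal((d // 2, d // 2))
x = rng.standard_normal((3, d))
print(log_likelihood(x, w_s, w_t))  # one exact log-density per row of x
```

Stacking several such layers, with the roles of the two halves alternating, yields an expressive yet exactly invertible model whose likelihood remains tractable.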

Presentation (PDF File)
