Contributions to deep learning using a mathematical approach: improved model uncertainty, certified robust models, and faster training of Neural ODEs.

Adam Oberman
McGill University
Mathematics and Statistics

Deep learning is a rapidly moving area, but many of its results are empirical and short-lived. Experts in the field have called for contributions from mathematics to bring more rigour to the subject. In this talk I will describe three problems where a mathematical approach has made a contribution: (i) deep model uncertainty, (ii) certified robust models, and (iii) neural ODEs. The corresponding mathematical tools are, respectively: statistical calibration, variational losses for Gaussian convolution, and regularization using optimal transport. The talk is geared to a broad audience: I’ll explain the problems in context and present the ideas at a high level. Details can be found in the papers.

