Structured ML Training via Conditional Gradients

Sebastian Pokutta
Konrad-Zuse-Zentrum für Informationstechnik (ZIB)
Department of Mathematics

Conditional Gradient methods are an important class of first-order methods for minimizing (non-)smooth convex functions over (combinatorial) polytopes. Recently, these methods have received a lot of attention because they enable structured optimization, and hence structured learning, by incorporating the underlying polyhedral structure into the solutions. In this talk I will give a broad overview of these methods and their applications, and present some recent results in traditional optimization and learning as well as in deep learning.
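For context, the basic conditional gradient (Frank-Wolfe) step avoids projections: it calls a linear minimization oracle over the feasible polytope and moves toward the returned vertex. Below is a minimal sketch in Python/NumPy under simple assumptions: the polytope is the probability simplex, the objective is a quadratic, and the step-size rule 2/(t+2) is the standard textbook choice. The function names and example are illustrative, not taken from the talk.

```python
import numpy as np

def simplex_lmo(gradient):
    """Linear minimization oracle over the probability simplex:
    argmin over vertices of <gradient, v> is the unit vector e_i
    with i = argmin_i gradient_i."""
    v = np.zeros_like(gradient)
    v[np.argmin(gradient)] = 1.0
    return v

def frank_wolfe(grad_f, lmo, x0, max_iter=200):
    """Conditional gradient (Frank-Wolfe): at each iteration, call the LMO
    on the current gradient and take a convex combination with the returned
    vertex. Iterates remain feasible by construction."""
    x = x0.copy()
    for t in range(max_iter):
        g = grad_f(x)
        v = lmo(g)                       # vertex of the polytope
        gamma = 2.0 / (t + 2.0)          # standard open-loop step size
        x = (1 - gamma) * x + gamma * v  # convex combination stays in the polytope
    return x

if __name__ == "__main__":
    # Illustrative objective: 0.5 * ||x - b||^2, i.e. projecting b onto the simplex.
    b = np.array([0.2, 1.5, -0.3, 0.7])
    grad_f = lambda x: x - b
    x0 = np.full(4, 0.25)                # start at the simplex barycenter
    x_star = frank_wolfe(grad_f, simplex_lmo, x0)
    print(x_star, x_star.sum())
```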

