First Order Algorithms for Convex Minimization

Marc Teboulle
Tel Aviv University
Mathematical Sciences

The gradient method is one of the oldest optimization algorithms, going back as early as 1847 to the initial work of Cauchy. Nowadays, gradient-based methods have attracted revived and intensive interest, both in theoretical optimization and in scientific applications. Indeed, the very large-scale nature of problems arising in many scientific applications, combined with the increased power of computer technology, has motivated a return to the "old and simple" first order methods that can overcome the curse of dimensionality, a task which is usually out of reach for the current, more sophisticated polynomial time algorithms. For very large scale problems with medium accuracy requirements, gradient-based methods often remain the only practical alternative.
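As a concrete illustration (a minimal sketch, not taken from the talk), the classical gradient step is x_{k+1} = x_k - (1/L) ∇f(x_k), where L is a Lipschitz constant of ∇f; each iteration needs only a gradient evaluation, which is what makes the method viable at very large scale. The quadratic test objective and step-size choice below are illustrative assumptions.

```python
import numpy as np

# Minimal gradient descent sketch for the convex quadratic
# f(x) = 0.5 x'Ax - b'x, whose gradient is Ax - b.
# The step size 1/L (L = largest eigenvalue of A, i.e. the Lipschitz
# constant of the gradient) is an illustrative choice, not from the talk.
def gradient_descent(A, b, x0, num_iters=1000):
    L = np.linalg.eigvalsh(A).max()  # Lipschitz constant of grad f
    x = x0.copy()
    for _ in range(num_iters):
        grad = A @ x - b             # grad f(x) = Ax - b
        x = x - grad / L             # Cauchy's gradient step
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_hat = gradient_descent(A, b, np.zeros(2))
print(x_hat, np.linalg.solve(A, b))  # the two should agree
```

Note that each iteration costs only one matrix-vector product here; in general, the per-iteration cost of a first order method is dominated by a single gradient evaluation, in contrast with the per-iteration linear algebra of interior point methods.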

This talk reviews basic results and recent advances in the design and analysis of gradient-based methods for a broad class of smooth and nonsmooth convex minimization problems. Essential tools and key ideas needed to build and analyze first order algorithms will be presented. We focus on optimization models and formulations, fundamental mathematical tools for convergence and complexity analysis, and fast gradient schemes with improved complexity. Throughout the talk, these ideas will be illustrated with various classes of optimization problems arising in diverse applications.
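The fast gradient schemes mentioned above refer to accelerated methods in the spirit of Nesterov, which improve the O(1/k) objective-error rate of plain gradient descent to O(1/k^2). The following is a minimal, hedged sketch of such a scheme (FISTA-style momentum applied to a smooth problem); the test problem and parameter choices are illustrative assumptions, not taken from the talk.

```python
import numpy as np

# Sketch of an accelerated (Nesterov/FISTA-style) gradient scheme for a
# smooth convex f with L-Lipschitz gradient: a gradient step from an
# extrapolated point, followed by a momentum update of that point.
def accelerated_gradient(grad, L, x0, num_iters=500):
    x = x0.copy()
    y = x0.copy()   # extrapolated point
    t = 1.0         # momentum parameter sequence
    for _ in range(num_iters):
        x_next = y - grad(y) / L  # gradient step from the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum step
        x, t = x_next, t_next
    return x

# Same illustrative quadratic as above: f(x) = 0.5 x'Ax - b'x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
L = np.linalg.eigvalsh(A).max()
x_hat = accelerated_gradient(lambda x: A @ x - b, L, np.zeros(2))
print(x_hat, np.linalg.solve(A, b))  # the two should agree
```

The design point of such schemes is that the improved O(1/k^2) rate is obtained at essentially the same per-iteration cost as plain gradient descent: one gradient evaluation plus a few vector operations.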

