Fast Alternating Direction Methods for Optimization

Tom Goldstein
Rice University

Alternating direction methods are a commonplace tool for general mathematical programming and optimization. These methods have become particularly important in the field of variational image processing, which frequently requires the minimization of non-differentiable objectives. This paper considers accelerated (i.e., fast) variants of two common alternating direction methods: the Alternating Direction Method of Multipliers (ADMM) and the Alternating Minimization Algorithm (AMA). The proposed acceleration is of the form first proposed by Nesterov for gradient descent methods. When the objective function is strongly convex, global convergence bounds are provided for both the classical and accelerated variants of the methods. Numerical examples are presented to demonstrate the superior performance of the fast methods.
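For illustration only, the following is a minimal sketch (not the paper's reference implementation) of ADMM with a Nesterov-style extrapolation step, applied to a small strongly convex l1-regularized toy problem: minimize (1/2)||x - c||^2 + lam*||z||_1 subject to x = z. The parameter names (rho, lam), the toy problem, and the restart-free momentum rule are assumptions made for this sketch.

import numpy as np

def fast_admm_toy(c, lam=0.5, rho=1.0, iters=100):
    # Hypothetical example problem; not the paper's test cases.
    n = c.size
    z = np.zeros(n); u = np.zeros(n)       # z: split variable, u: scaled dual variable
    z_hat, u_hat = z.copy(), u.copy()      # extrapolated (accelerated) copies
    alpha = 1.0                            # Nesterov momentum parameter
    for _ in range(iters):
        # x-update: minimize (1/2)||x - c||^2 + (rho/2)||x - z_hat + u_hat||^2
        x = (c + rho * (z_hat - u_hat)) / (1.0 + rho)
        # z-update: prox of the l1 term, i.e. soft-thresholding
        v = x + u_hat
        z_new = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual update (scaled form)
        u_new = u_hat + x - z_new
        # Nesterov-type extrapolation of (z, u)
        alpha_new = (1.0 + np.sqrt(1.0 + 4.0 * alpha**2)) / 2.0
        beta = (alpha - 1.0) / alpha_new
        z_hat = z_new + beta * (z_new - z)
        u_hat = u_new + beta * (u_new - u)
        z, u, alpha = z_new, u_new, alpha_new
    return x, z

if __name__ == "__main__":
    c = np.array([3.0, -0.2, 1.5, 0.05])
    x, z = fast_admm_toy(c)
    print(x, z)

Setting beta = 0 in the sketch recovers ordinary (unaccelerated) ADMM, which makes the extrapolation step easy to compare against the classical method.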

