Recent Advances in Alternating Direction Methods: Theory and Practice

Yin Zhang
Rice University
Computational and Applied Mathematics

The classic augmented Lagrangian Alternating Direction Method (ADM) has recently found great utility in solving convex, separable optimization problems arising from signal/image processing and sparse optimization. In this talk, we first give some recent examples of ADM applications, including extensions to non-convex and non-separable problems. We then present new convergence results that extend the classic ADM convergence theory in several respects.
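For context, the classic two-block ADM iteration referenced above can be summarized as follows; this is a standard textbook statement added here for the reader's convenience, and the notation (f, g, A, B, b, beta) is illustrative rather than taken from the talk. For the separable problem
\[
\min_{x,\,y}\; f(x) + g(y) \quad \text{subject to} \quad Ax + By = b,
\]
with augmented Lagrangian
\[
\mathcal{L}_{\beta}(x, y, \lambda) \;=\; f(x) + g(y) + \lambda^{\top}(Ax + By - b) + \tfrac{\beta}{2}\,\|Ax + By - b\|_2^2, \qquad \beta > 0,
\]
the classic ADM alternates the two primal minimizations and then updates the multiplier:
\[
\begin{aligned}
x^{k+1} &\in \arg\min_{x}\; \mathcal{L}_{\beta}(x, y^{k}, \lambda^{k}),\\
y^{k+1} &\in \arg\min_{y}\; \mathcal{L}_{\beta}(x^{k+1}, y, \lambda^{k}),\\
\lambda^{k+1} &= \lambda^{k} + \beta\,(Ax^{k+1} + By^{k+1} - b).
\end{aligned}
\]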

