Block multi-convex optimization

Wotao Yin
Rice University
Computational and Applied Mathematics

This talk considers block multi-convex optimization, where the feasible set and objective function are generally non-convex but convex in each block of variables. It also allows non-convex blocks, though it requires these blocks to be updated by proximal minimization. After reviewing some interesting applications such as blind source separation, sparse dictionary learning, nonnegative matrix and tensor factorization, and low-rank matrix and tensor recovery, we propose a generalized block coordinate update method. While non-convex and/or nonsmooth functions can generally cause a block coordinate descent method to fail, we establish the global convergence and asymptotic rate of convergence of our method applied to block multi-convex problems with separable nonsmooth terms. The analysis is based on establishing the Kurdyka-Lojasiewicz inequality. We give a few large classes of functions that meet our assumptions. This is joint work with Yangyang Xu.
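To make the block-update idea concrete, below is a minimal sketch (not taken from the talk or its paper) of alternating prox-linear block updates applied to nonnegative matrix factorization, one of the block multi-convex applications mentioned above. The function name prox_nmf, the step-size rule, and the fixed iteration count are illustrative assumptions; the objective 0.5*||A - XY||_F^2 with X, Y >= 0 is non-convex jointly but convex in each block.

```python
import numpy as np

def prox_nmf(A, r, steps=200, seed=0):
    """Sketch of block coordinate updates for NMF:
    minimize 0.5 * ||A - X Y||_F^2 subject to X >= 0, Y >= 0.
    Each block update is a prox-linear (projected gradient) step,
    where the prox of the nonnegativity constraint is a projection."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.random((m, r))
    Y = rng.random((r, n))
    for _ in range(steps):
        # X-block: gradient is (X @ Y - A) @ Y.T; its Lipschitz
        # constant is the spectral norm of Y @ Y.T. Step 1/L, then
        # project onto the nonnegative orthant.
        L = np.linalg.norm(Y @ Y.T, 2) + 1e-12
        X = np.maximum(X - ((X @ Y - A) @ Y.T) / L, 0.0)
        # Y-block: symmetric update with the roles of X and Y swapped.
        L = np.linalg.norm(X.T @ X, 2) + 1e-12
        Y = np.maximum(Y - (X.T @ (X @ Y - A)) / L, 0.0)
    return X, Y

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.random((30, 5)) @ rng.random((5, 40))  # nonnegative, rank 5
    X, Y = prox_nmf(A, r=5)
    print(np.linalg.norm(A - X @ Y) / np.linalg.norm(A))  # relative residual
```

Each subproblem here is convex (a quadratic over the nonnegative orthant), which is what the block multi-convex structure guarantees; the proximal/prox-linear form of the update is what the convergence analysis in the talk relies on.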
