Augmented L1 and Nuclear-Norm Minimization with a Globally Linearly Convergent Algorithm

Wotao Yin, Rice University, Computational and Applied Mathematics

L1 minimization tends to give sparse solutions, while least squares (LS) gives dense solutions. We show that minimizing the weighted sum of the L1 and LS terms, with an appropriately small weight on the LS term, can efficiently recover sparse vectors with provable recovery guarantees. For compressive sensing, exact and stable recovery guarantees can be given in terms of the null-space property, restricted isometry property, spherical section property, and “RIPless” property of the sensing matrix. Moreover, we show that the Lagrange dual problem of L1+LS minimization is convex, unconstrained, and differentiable; hence, a rich set of classical techniques, such as gradient descent, line search, Barzilai-Borwein steps, quasi-Newton methods, and Nesterov’s acceleration, can be applied directly. We show that the gradient descent iteration is globally linearly convergent, and we give an explicit rate. This is the first global linear convergence result among gradient-based algorithms for sparse optimization. We also present an algorithm based on limited-memory BFGS and demonstrate its superior performance over several existing L1 solvers. This is joint work with Ming-Jun Lai (U. Georgia at Athens) and Zhimin Peng (Rice).
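The dual gradient descent the abstract refers to can be illustrated with a short sketch. The sketch below assumes the augmented L1 model min ||x||_1 + 1/(2α)||x||² subject to Ax = b (the abstract does not spell out the formulation), whose Lagrange dual is smooth and unconstrained; a gradient ascent step on the dual variable y recovers the primal iterate via soft-thresholding. Function names, the parameter α, and the step-size choice are illustrative, not taken from the talk.

```python
import numpy as np

def shrink(z, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def augmented_l1_dual_gd(A, b, alpha, tau=None, iters=20000):
    """Gradient ascent on the smooth, unconstrained Lagrange dual of
        min ||x||_1 + 1/(2*alpha) * ||x||_2^2   s.t.  A x = b.
    The primal solution is read off as x = alpha * shrink(A^T y, 1)."""
    m, n = A.shape
    if tau is None:
        # The dual gradient is Lipschitz with constant alpha * ||A||_2^2,
        # so this constant step is safely inside the stable range.
        tau = 1.0 / (alpha * np.linalg.norm(A, 2) ** 2)
    y = np.zeros(m)
    for _ in range(iters):
        x = alpha * shrink(A.T @ y, 1.0)   # primal iterate from dual variable
        y = y + tau * (b - A @ x)          # ascend along the dual gradient b - Ax
    return alpha * shrink(A.T @ y, 1.0)
```

Per the abstract, this plain iteration already converges globally at a linear rate; the same dual objective could instead be fed to Barzilai-Borwein, L-BFGS, or Nesterov acceleration for faster convergence.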

Presented at Advances in Scientific Computing, Imaging Science and Optimization: Stan Osher's 70th Birthday Conference.