The field of optimization has recently been challenged by applications that require structured, approximate solutions rather than the exact solutions that are the traditional goal of optimization algorithms. Instances of structure include sparsity of the solution vector (as occurs in compressed sensing and support vector machines), low matrix rank (as required in matrix completion, distance matrix estimation, and kernel regularization), and low total variation (as needed in image processing applications). Structured solutions can be obtained in some cases by modifying the optimization formulation, for example by adding regularization terms or additional constraints. The algorithms appropriate for these modified formulations may be quite different from those that work for the original formulations, in part because the regularization terms frequently introduce nonsmoothness, and because highly accurate solutions (even of the regularized formulation) may not be needed by the application. Add to these factors the large size of such applications and the frequent need to solve them in real time, and we have a significant challenge to current optimization methodology.
This workshop brings together experts on techniques that are currently used, or could potentially be used, to solve sparse/structured problems and other problem classes of recent interest. We mention in particular techniques for conic optimization formulations (which also have applications in robust optimization), fast gradient and subgradient methods, stochastic approximation techniques, and semismooth Newton and other methods that use second-order information. The workshop will also involve nonlinear programming researchers, with a view to making tighter connections between recent research in that area and the emerging paradigms discussed above.