Nonsmooth Nonconvex Optimization in Machine Learning and Data Mining

Kristin Bennett
Rensselaer Polytechnic Institute

Many machine learning and data mining problems can be expressed as optimization models. To address problems on massive datasets and in real-time settings, scalable, high-performance algorithms are required to solve these models. Recently, nonsmooth optimization methods have led to massively scalable algorithms for solving convex machine learning models such as Support Vector Machines. Nonconvex models can elegantly capture learning tasks such as model selection within support vector machines and multiple instance learning, as well as visualization tasks such as annotated graph drawing with minimal edge crossings. This talk examines how novel nonsmooth nonconvex optimization methods can analogously be used to create scalable algorithms for solving these nonconvex machine learning and data mining models. Results are illustrated on compelling real-world problems in drug discovery and in tuberculosis tracking and control.
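To make the nonsmooth convex starting point concrete, the sketch below applies plain subgradient descent to the standard hinge-loss SVM objective, whose hinge term is nondifferentiable at the margin. This is only an illustrative sketch, not the specific scalable methods discussed in the talk; the function name, step-size rule, and toy data are assumptions for the example.

```python
# Minimal sketch: subgradient descent on the nonsmooth, convex SVM objective
#   f(w) = (lam/2) * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>).
# Names and parameters here are illustrative, not the talk's algorithm.
import numpy as np

def svm_subgradient(X, y, lam=0.1, steps=1000):
    """Train a linear SVM (no bias term) by subgradient descent.

    X : (n, d) array of features; y : (n,) array of +/-1 labels.
    """
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, steps + 1):
        margins = y * (X @ w)          # y_i <w, x_i>
        active = margins < 1           # points violating the margin
        # A subgradient of the hinge term is -(1/n) * sum of y_i x_i over active points.
        g = lam * w - (active * y) @ X / n
        w -= g / (lam * t)             # diminishing 1/(lam*t) step size
    return w

# Toy usage on synthetic, roughly separable data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + X[:, 1])
w = svm_subgradient(X, y)
print("training accuracy:", np.mean(np.sign(X @ w) == y))
```

The nonconvex models in the talk replace pieces of such convex objectives with nonconvex terms, which is why analogous nonsmooth nonconvex methods are needed.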

