Support Vector Machines meet Goldfarb Cubes

Bernd Gärtner
ETH Zürich
Computer Science

Support vector machines are convex quadratic programs used in machine learning to classify data items into "positives" and "negatives" after a training phase with items whose classification is known. A parameter controls how accurately the training items are taken into account; in specific applications, the right choice of this parameter is often found by experiment.
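For concreteness, a standard soft-margin formulation (the abstract does not fix one, so the notation below is an assumption) is the convex quadratic program

\[
\min_{w,\,b,\,\xi}\;\; \tfrac{1}{2}\lVert w\rVert^2 \;+\; C \sum_{i=1}^{n} \xi_i
\quad\text{subject to}\quad
y_i\,(w^\top x_i + b) \;\ge\; 1 - \xi_i, \qquad \xi_i \ge 0, \qquad i = 1,\dots,n,
\]

where the \((x_i, y_i)\) with \(y_i \in \{-1,+1\}\) are the training items and \(C > 0\) is the parameter trading margin width against training accuracy.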

In a seminal paper, Hastie et al. suggested facilitating such experiments by computing the entire solution path, i.e., the solution of the parameterized quadratic program as an explicit piecewise linear function of the parameter. The underlying empirical observation was that the solution path consists of a small number of pieces and can therefore be computed efficiently.
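As a sketch of what "piecewise linear" means here (using the parameterization \(\lambda = 1/C\) of Hastie et al.; the breakpoint notation below is ours, not theirs): the optimal dual coefficients \(\alpha(\lambda)\) satisfy

\[
\alpha(\lambda) \;=\; \alpha(\lambda_k) + (\lambda - \lambda_k)\, v_k
\qquad\text{for } \lambda_{k+1} \le \lambda \le \lambda_k,
\]

with a fixed direction vector \(v_k\) on each piece; a new piece begins at a breakpoint \(\lambda_{k+1}\) whenever a training item crosses the margin and the active index sets change. The length of the solution path is the number of such pieces.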

Hastie et al. in fact conjectured that the solution path of a support vector machine is always short. We falsify this conjecture by constructing a support vector machine with an exponentially long solution path. The construction is based on the good old Goldfarb cubes, which can be interpreted as parameterized linear programs with exponentially long solution paths.
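Schematically (this parameterization is one way to read the construction, not a quote from the paper), a Goldfarb cube \(G_n \subseteq \mathbb{R}^n\) yields a parameterized linear program

\[
\max_{x \in G_n} \; (c + t\,d)^\top x, \qquad t \in [0,1],
\]

whose optimal vertex \(x^*(t)\) changes exponentially often in \(n\) as \(t\) varies, so this solution path already has exponentially many pieces; it is this behavior that the construction transfers to the support vector machine setting.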


(joint work with Martin Jaggi and Clément Maria)

