Optimization techniques for Support Vector Machines

Olivier Chapelle
Yahoo! Research

I will discuss optimization methods for Support Vector Machines (SVMs) and related algorithms, with a particular emphasis on large-scale methods. Standard algorithms such as conjugate gradient, Newton's method, and stochastic gradient descent turn out to be very efficient in the context of primal SVM training. I will also present semi-supervised algorithms with an application to web spam detection, and a structured output learning algorithm applied to ranking. Finally, we will see how these algorithms can easily be extended to non-linear functions through functional gradient boosting.
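To illustrate the kind of primal training the abstract refers to, here is a minimal sketch of stochastic (sub)gradient descent on the primal hinge-loss SVM objective, in the style of Pegasos. The function name, step-size schedule, and toy data are illustrative assumptions, not taken from the talk itself.

```python
import numpy as np

def train_primal_svm_sgd(X, y, lam=0.01, epochs=20, seed=0):
    """SGD on the primal SVM objective (illustrative sketch):
        lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * w . x_i)
    Labels y must be in {-1, +1}. No bias term, for simplicity."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size (assumed schedule)
            w *= (1.0 - eta * lam)         # gradient step on the regularizer
            if y[i] * X[i].dot(w) < 1:     # hinge loss is active: take subgradient step
                w += eta * y[i] * X[i]
    return w

# toy linearly separable data (hypothetical example)
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w = train_primal_svm_sgd(X, y)
preds = np.sign(X @ w)
```

Working directly in the primal avoids the dual quadratic program entirely; each update touches a single example, which is what makes this family of methods attractive at large scale.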

Presentation (PDF File)

Back to Workshops II: Numerical Tools and Fast Algorithms for Massive Data Mining, Search Engines and Applications