Optimization techniques for Support Vector Machines

Olivier Chapelle

Yahoo! Research

Abstract

I will discuss some optimization methods for Support Vector Machines (SVMs) and related algorithms, with a particular emphasis on large-scale methods. Standard algorithms such as conjugate gradient, Newton's method, and stochastic gradient descent turn out to be very efficient in the context of primal SVM training. I will also present semi-supervised algorithms with an application to web spam detection, as well as a structured output learning algorithm applied to ranking. Finally, I will show how these algorithms can easily be extended to non-linear functions through functional gradient boosting.
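To make the primal-training idea concrete, here is a minimal sketch (not the author's implementation) of training a linear SVM directly in the primal with stochastic subgradient descent on the regularized hinge loss, using a Pegasos-style decaying step size. The function name and all parameters are illustrative assumptions.

```python
import numpy as np

def sgd_primal_svm(X, y, lam=0.01, epochs=20, seed=0):
    """Illustrative primal SVM: minimize (lam/2)||w||^2 + mean hinge loss
    by stochastic subgradient descent. Labels y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)  # Pegasos-style step size schedule (assumption)
            margin = y[i] * (X[i] @ w)
            # Subgradient step: shrink w for the regularizer, and add
            # eta * y_i * x_i only when the hinge loss is active (margin < 1).
            w *= (1.0 - eta * lam)
            if margin < 1:
                w += eta * y[i] * X[i]
    return w
```

In practice, such a solver would also handle a bias term and a stopping criterion; the sketch keeps only the core update to show why primal training scales well, since each step touches a single example.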