LibSGDQN provides an implementation of SGD-QN, a carefully designed quasi-Newton stochastic gradient descent solver for linear SVMs.
The SGD-QN algorithm is a stochastic gradient descent algorithm that makes careful use of second-order information and splits the parameter update into independently scheduled components. Thanks to this design, SGD-QN iterates nearly as fast as a first-order stochastic gradient descent but requires fewer iterations to achieve the same accuracy.
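The split-update idea can be illustrated with a short sketch. This is a minimal illustration under simplifying assumptions, not the library's actual implementation: the function name `sgdqn_sketch` is hypothetical, the diagonal second-order scaling `B` is held fixed at 1/λ for simplicity (the real SGD-QN re-estimates it online from successive gradients), and the regularization component of the update is executed only once every `skip` examples, amortized accordingly.

```python
import numpy as np

def sgdqn_sketch(X, y, lam=1e-4, epochs=10, skip=16, t0=1e4):
    """Simplified SGD-QN-style solver for a hinge-loss linear SVM.

    Hypothetical sketch: the diagonal scaling B is kept fixed here,
    whereas SGD-QN proper updates it online; the regularization
    update is scheduled independently, every `skip` examples.
    """
    n, d = X.shape
    w = np.zeros(d)
    B = np.full(d, 1.0 / lam)  # fixed diagonal scaling (SGD-QN re-estimates this)
    t = 0
    count = skip
    for _ in range(epochs):
        for i in range(n):
            eta = 1.0 / (t + t0)
            if y[i] * (w @ X[i]) < 1.0:          # hinge-loss margin violated
                w += eta * (B * (y[i] * X[i]))   # scaled loss-gradient step
            count -= 1
            if count <= 0:
                # scheduled regularization component, amortized over `skip` steps
                w *= 1.0 - skip * eta * lam * B
                count = skip
            t += 1
    return w
```

Scheduling the regularization term separately is what lets each inner iteration touch only the (often sparse) features of the current example, keeping the per-example cost close to plain first-order SGD.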
The algorithm is described in detail in the paper "SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent" by A. Bordes, L. Bottou and P. Gallinari, published in the Journal of Machine Learning Research: Special Topic on Large Scale Learning (2009).
Along with SGD-QN, the library provides implementations of two other online solvers for linear SVMs (also discussed in the JMLR paper), as well as a script to reproduce the paper's experiments.
- Changes to previous version:
Initial Announcement on mloss.org.