- Description:
Efficient implementation of the Semi-Stochastic Gradient Descent (S2GD) algorithm for training L2-regularized logistic regression.
The S2GD algorithm enjoys linear convergence (faster than SGD) and does not require tuning parameters such as the stepsize (the choice is explained in the paper).
The code is written in C++ and called from MATLAB. It is well commented and should be easy to adapt to different applications.
The package also contains implementations of SGD and SAG. A sketch of the S2GD loop structure is given after the usage notes below.
First-time usage:
>> mexAll % compiles the .cpp files
>> demo   % runs the included demo
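For orientation, here is a minimal C++ sketch of the S2GD loop structure applied to L2-regularized logistic regression on dense data. It is not the package's actual code or API: the function names (grad_i, s2gd) and parameters are illustrative, the inner-loop length is drawn uniformly rather than from the distribution specified in the paper, and there are no performance optimizations.

// Illustrative sketch of S2GD for L2-regularized logistic regression (dense data).
// Not the package's implementation; names and the inner-loop distribution are simplified.
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

// Gradient of the i-th loss term: -y_i * x_i * sigma(-y_i * <x_i, w>) + lambda * w
static std::vector<double> grad_i(const std::vector<std::vector<double>>& X,
                                  const std::vector<double>& y,
                                  const std::vector<double>& w,
                                  double lambda, std::size_t i) {
    double dot = 0.0;
    for (std::size_t j = 0; j < w.size(); ++j) dot += X[i][j] * w[j];
    double s = 1.0 / (1.0 + std::exp(y[i] * dot));  // sigma(-y_i * <x_i, w>)
    std::vector<double> g(w.size());
    for (std::size_t j = 0; j < w.size(); ++j) g[j] = -y[i] * X[i][j] * s + lambda * w[j];
    return g;
}

// Outer iterations compute a full gradient at a reference point; inner iterations
// take stochastic steps with a variance-reduced gradient estimate.
std::vector<double> s2gd(const std::vector<std::vector<double>>& X,
                         const std::vector<double>& y,
                         double lambda, double stepsize,
                         int outer_iters, int max_inner) {
    std::size_t n = X.size(), d = X[0].size();
    std::vector<double> w(d, 0.0);
    std::mt19937 rng(0);
    std::uniform_int_distribution<std::size_t> pick(0, n - 1);
    std::uniform_int_distribution<int> inner_len(1, max_inner);  // simplified: uniform inner-loop length

    for (int k = 0; k < outer_iters; ++k) {
        // Full gradient mu at the reference point ref = current iterate
        std::vector<double> ref = w, mu(d, 0.0);
        for (std::size_t i = 0; i < n; ++i) {
            std::vector<double> g = grad_i(X, y, ref, lambda, i);
            for (std::size_t j = 0; j < d; ++j) mu[j] += g[j] / n;
        }
        // Random number of inner steps with the variance-reduced estimate
        int t = inner_len(rng);
        for (int s = 0; s < t; ++s) {
            std::size_t i = pick(rng);
            std::vector<double> gw = grad_i(X, y, w, lambda, i);
            std::vector<double> gr = grad_i(X, y, ref, lambda, i);
            for (std::size_t j = 0; j < d; ++j)
                w[j] -= stepsize * (gw[j] - gr[j] + mu[j]);
        }
    }
    return w;
}

The key point is that each inner step uses the estimate grad_i(w) - grad_i(ref) + mu, whose variance shrinks as the iterates approach the optimum; this is what yields the linear convergence mentioned above.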
- Changes to previous version:
Initial Announcement on mloss.org.
- BibTeX Entry: Download
- Corresponding Paper BibTeX Entry: Download
- Supported Operating Systems: Agnostic
- Data Formats: Agnostic
- Tags: Stochastic Gradient Descent, Logistic Regression
- Archive: download here