- Description:
SMIDAS is a C++ implementation of the stochastic mirror descent algorithm proposed in
* Shai Shalev-Shwartz and Ambuj Tewari, Stochastic methods for l1 regularized loss minimization. Proceedings of the 26th International Conference on Machine Learning, pages 929-936, 2009.
It can be used for l1-regularized loss minimization for both classification and regression problems.
Currently supported loss functions are the logistic loss, the hinge loss, and the squared loss [L(a,b) = (a-b)²]. SMIDAS is designed to run fast even on high-dimensional, large datasets and can exploit sparsity in the examples. A sketch of the core update step is given after the listing below.
- Changes to previous version:
Initial announcement on mloss.org.
- BibTeX Entry: Download
- Corresponding Paper BibTeX Entry: Download
- Supported Operating Systems: Agnostic
- Data Formats: ASCII
- Tags: L1 Regularization, Large Datasets, Mirror Descent, Sparsity
- Archive: download here
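
For readers who want a sense of the mechanics, below is a minimal, illustrative C++ sketch of one SMIDAS-style update, assuming the scheme described in the cited paper: a stochastic subgradient step on the dual vector, l1 soft-thresholding, and a p-norm link function mapping the dual vector back to the weights. The names used here (smidas_step, dual_to_primal, eta, lambda) are placeholders for this sketch only, not identifiers from the SMIDAS package.

// Illustrative sketch only; not the released SMIDAS code or its API.
#include <cmath>
#include <cstddef>
#include <vector>

// Map the dual vector theta to primal weights w via the gradient of
// (1/2) * ||theta||_q^2, with q the conjugate exponent of p (q = p / (p - 1)).
std::vector<double> dual_to_primal(const std::vector<double>& theta, double q) {
    double norm_q = 0.0;
    for (double t : theta) norm_q += std::pow(std::fabs(t), q);
    norm_q = std::pow(norm_q, 1.0 / q);
    std::vector<double> w(theta.size(), 0.0);
    if (norm_q == 0.0) return w;                      // all-zero theta maps to w = 0
    for (std::size_t j = 0; j < theta.size(); ++j)
        w[j] = std::copysign(std::pow(std::fabs(theta[j]), q - 1.0), theta[j])
               / std::pow(norm_q, q - 2.0);
    return w;
}

// One update on a single example (x, y). loss_deriv(a, y) is dL(a, y)/da,
// e.g. 2 * (a - y) for the squared loss L(a, y) = (a - y)^2.
void smidas_step(std::vector<double>& theta, std::vector<double>& w,
                 const std::vector<double>& x, double y,
                 double eta, double lambda, double q,
                 double (*loss_deriv)(double, double)) {
    double a = 0.0;                                   // prediction <w, x>
    for (std::size_t j = 0; j < x.size(); ++j) a += w[j] * x[j];
    const double g = loss_deriv(a, y);
    for (std::size_t j = 0; j < theta.size(); ++j) {
        theta[j] -= eta * g * x[j];                   // subgradient step in dual space
        const double shrunk = std::fabs(theta[j]) - eta * lambda;
        theta[j] = shrunk > 0.0 ? std::copysign(shrunk, theta[j]) : 0.0;  // l1 soft-threshold
    }
    w = dual_to_primal(theta, q);
}

Here q is the conjugate exponent of p, and the paper analyzes the choice p = 2 ln d, where d is the number of features. Only the subgradient step involves the example, so with sparse inputs it need only visit the non-zero coordinates of x, which is one way an implementation can exploit the sparsity mentioned in the description.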