Project details for LIBOL

LIBOL 0.1.0

by stevenhoi - December 27, 2012, 18:09:54 CET


Description:

LIBOL is an open-source library for large-scale online classification. It provides a large family of efficient, scalable, state-of-the-art online learning algorithms, together with easy-to-use command-line tools, examples, and documentation for both beginners and advanced users. LIBOL is not only a machine learning tool, but also a comprehensive experimental platform for conducting online learning research.

In general, the existing online learning algorithms for linear classification tasks can be grouped into two major categories: (i) first order learning (Rosenblatt, 1958; Crammer et al., 2006), and (ii) second order learning (Dredze et al., 2008; Wang et al., 2012; Yang et al., 2009).

Example online learning algorithms in the first order learning category implemented in this library include:

• Perceptron: the classical online learning algorithm (Rosenblatt, 1958);

• ALMA: A New Approximate Maximal Margin Classification Algorithm (Gentile, 2001);

• ROMMA: the relaxed online maximum margin algorithms (Li and Long, 2002);

• OGD: the Online Gradient Descent (OGD) algorithms (Zinkevich, 2003);

• PA: Passive Aggressive (PA) algorithms (Crammer et al., 2006), one of the state-of-the-art first order online learning algorithms.
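
To illustrate the flavor of these first order methods, here is a minimal sketch of the Perceptron and PA-I update rules in Python with NumPy. This is an illustrative reimplementation, not LIBOL's own (MATLAB/Octave) code; the function names and the toy data stream are assumptions for the example.

```python
import numpy as np

def perceptron_update(w, x, y):
    """Classical Perceptron (Rosenblatt, 1958): update only on a mistake."""
    if y * np.dot(w, x) <= 0:       # prediction disagrees with label y in {-1, +1}
        w = w + y * x               # move the weight vector toward the example
    return w

def pa_update(w, x, y, C=1.0):
    """Passive-Aggressive PA-I (Crammer et al., 2006): margin-based step."""
    loss = max(0.0, 1.0 - y * np.dot(w, x))    # hinge loss
    if loss > 0.0:
        tau = min(C, loss / np.dot(x, x))      # capped step size (PA-I variant)
        w = w + tau * y * x
    return w

# One pass over a toy stream of (x, y) pairs
rng = np.random.default_rng(0)
w = np.zeros(2)
for _ in range(100):
    x = rng.normal(size=2)
    y = 1.0 if x[0] + x[1] > 0 else -1.0       # linearly separable labels
    w = pa_update(w, x, y)
```

Both updates touch only the current example, which is what makes first order methods attractive for large-scale streams: each step costs O(d) in the feature dimension.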

Example algorithms in the second order online learning category implemented in this library include the following:

• SOP: the Second Order Perceptron (SOP) algorithm (Cesa-Bianchi et al., 2005);

• CW: the Confidence-Weighted (CW) learning algorithm (Dredze et al., 2008);

• IELLIP: online learning algorithms by improved ellipsoid method (Yang et al., 2009);

• AROW: the Adaptive Regularization of Weight Vectors (Crammer et al., 2009);

• NAROW: New variant of Adaptive Regularization (Orabona and Crammer, 2010);

• NHERD: the Normal Herding method via Gaussian Herding (Crammer and Lee, 2010);

• SCW: the recently proposed Soft Confidence-Weighted algorithms (Wang et al., 2012).
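
Unlike the first order methods above, second order algorithms additionally maintain confidence information about the weights. As a hedged sketch (again in Python rather than LIBOL's MATLAB/Octave, with hypothetical function names), here is the AROW update (Crammer et al., 2009), which keeps a Gaussian over the weights with mean w and covariance Sigma:

```python
import numpy as np

def arow_update(w, Sigma, x, y, r=1.0):
    """One AROW step: adaptively regularize the weight distribution.
    w: mean weight vector; Sigma: covariance over weights; r: regularizer."""
    margin = y * np.dot(w, x)
    if margin < 1.0:                    # suffered hinge loss on this example
        Sx = Sigma @ x
        v = np.dot(x, Sx)               # confidence: variance along direction x
        beta = 1.0 / (v + r)
        alpha = (1.0 - margin) * beta
        w = w + alpha * y * Sx          # mean update, scaled by confidence
        Sigma = Sigma - beta * np.outer(Sx, Sx)   # shrink variance along x
    return w, Sigma

# Initialize with zero mean and identity covariance, then process one example
d = 2
w, Sigma = np.zeros(d), np.eye(d)
w, Sigma = arow_update(w, Sigma, np.array([1.0, 0.0]), 1.0)
```

The covariance shrinks along directions that have been seen often, so rarely-seen features receive larger updates; this confidence weighting is the common thread among the second order methods listed above, though each differs in its exact update rule.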

LIBOL continues to be improved based on feedback from users and new research results.

More information can be found on our project website: http://libol.stevenhoi.org/

Changes to previous version:

Initial Announcement on mloss.org.

Supported Operating Systems: Platform Independent
Data Formats: Svmlight, Libsvm, Any Format Supported By Matlab
Tags: Classification, Online Learning, Data Streams, Scalable Learning

Other available revisions

Version Changelog Date
0.3.0

In contrast to our last version (V0.2.3), the new version (V0.3.0) has made some important changes as follows:

• Added a template and guide for adding new algorithms;

• Improved parameter settings and clarified the documentation;

• Improved documentation on data formats and key functions;

• Amended the "OGD" function to support different loss types;

• Fixed some naming inconsistencies and other minor bugs.

December 12, 2013, 15:26:14
0.2.3

In contrast to our last version (V0.2.0), the new version (V0.2.3) has made some important changes as follows:

• Made the library fully compatible with Octave;

• Included a C/C++ function rand_c for random number generation;

• Included arg_check to check argument validity;

• Fixed some other bugs.

September 23, 2013, 11:58:17
0.2.0

In contrast to our first version, the new version (V0.2.0) has made some important changes as follows:

• Support online multiclass classification;

• C/C++ implementation for core functions;

• 16 algorithms and variants for binary classification;

• 13 algorithms and variants for multiclass classification.

July 27, 2013, 10:15:45
0.1.0

Initial Announcement on mloss.org.

December 27, 2012, 18:09:54
