About: Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as an agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions and learning algorithms through the definition of specific interfaces. Once a new kernel function has been implemented, it can be automatically adopted in all the available kernel-machine algorithms. KeLP includes different Online and Batch Learning algorithms for Classification, Regression and Clustering, as well as several Kernel functions, ranging from vector-based to structural kernels. It allows users to build complex kernel-machine-based systems, leveraging JSON/XML interfaces to instantiate prediction models without writing a single line of code. Changes:In addition to minor improvements and bug fixes, this release includes:
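To illustrate the decoupling idea described above, here is a generic Python sketch of the pattern (not KeLP's actual Java API): any object exposing a kernel computation can be plugged into a kernel machine unchanged.

    # Minimal sketch of decoupling kernels from learners (illustrative only).
    class LinearKernel:
        def compute(self, x, z):
            return sum(a * b for a, b in zip(x, z))

    class PolynomialKernel:
        def __init__(self, degree=2, c=1.0):
            self.degree, self.c = degree, c
        def compute(self, x, z):
            return (sum(a * b for a, b in zip(x, z)) + self.c) ** self.degree

    class KernelPerceptron:
        """A kernel machine that depends only on the kernel interface."""
        def __init__(self, kernel):
            self.kernel = kernel
            self.alphas = []  # (signed coefficient, support vector) pairs
        def predict(self, x):
            return sum(a * self.kernel.compute(sv, x) for a, sv in self.alphas)
        def learn(self, x, y):
            if y * self.predict(x) <= 0:  # mistake-driven update
                self.alphas.append((y, x))

    # The same learner works with either kernel, without modification.
    clf = KernelPerceptron(PolynomialKernel(degree=2))
    clf.learn([1.0, 0.0], +1)
    clf.learn([0.0, 1.0], -1)
    print(clf.predict([1.0, 0.2]))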
Check out this new version from our repositories. The API Javadoc is already available. Your suggestions will be very valuable to us, so download and try KeLP 2.2.2!
|
About: Obandit is an OCaml module for multi-armed bandits. It supports the EXP, UCB, and epsilon-greedy families of algorithms. Changes:Initial Announcement on mloss.org.
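As a rough illustration of one of the policy families mentioned above, here is a minimal epsilon-greedy sketch in Python (Obandit itself is an OCaml module; names and structure here are illustrative only).

    import random

    class EpsilonGreedy:
        """Explore a random arm with probability epsilon, else exploit the best mean."""
        def __init__(self, n_arms, epsilon=0.1):
            self.epsilon = epsilon
            self.counts = [0] * n_arms     # pulls per arm
            self.values = [0.0] * n_arms   # running mean reward per arm
        def select_arm(self):
            if random.random() < self.epsilon:
                return random.randrange(len(self.counts))
            return max(range(len(self.values)), key=lambda a: self.values[a])
        def update(self, arm, reward):
            self.counts[arm] += 1
            self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

    # Toy usage: Bernoulli arms with unknown success probabilities.
    arms = [0.2, 0.5, 0.8]
    policy = EpsilonGreedy(len(arms), epsilon=0.1)
    for _ in range(1000):
        a = policy.select_arm()
        policy.update(a, 1.0 if random.random() < arms[a] else 0.0)
    print(policy.values)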
|
About: A Matlab benchmarking toolbox for online and adaptive regression with kernels. Changes:
|
About: General-purpose Java machine learning library for classification, regression, and clustering. Changes:See the GitHub releases tab for change information.
|
About: SALSA (Software lab for Advanced machine Learning with Stochastic Algorithms) is an implementation of the well-known stochastic algorithms for Machine Learning, developed in the high-level technical computing language Julia. The SALSA software package is designed to address challenges in sparse linear modelling and linear and non-linear Support Vector Machines applied to large data samples, with a user-centric and user-friendly emphasis. Changes:Initial Announcement on mloss.org.
|
About: Incremental (Online) Nonparametric Classifier. You can classify both points (standard) and matrices (multivariate time series). Java and Matlab code are already available. Changes:version 2: parameterless system, constant model size, prediction confidence (for active learning). NEW!! C++ version at: https://github.com/ilaria-gori/ABACOC
|
About: Hivemall is a scalable machine learning library running on Hive/Hadoop. Changes:
|
About: An implementation of adaptive probabilistic mappings. Changes:Initial Announcement on mloss.org.
|
About: Massive Online Analysis (MOA) is a real-time analytics tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license. Changes:New version November 2013
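For readers unfamiliar with the Hoeffding Trees mentioned above: a leaf is split only once the Hoeffding bound guarantees that the observed best split attribute is truly the best one. A minimal Python sketch of that test follows (MOA itself is Java; the function and parameter names here are illustrative).

    import math

    def hoeffding_bound(value_range, delta, n):
        """With probability 1 - delta, the true mean lies within this epsilon
        of the empirical mean after n independent observations."""
        return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

    def should_split(best_gain, second_best_gain, n_examples, n_classes=2, delta=1e-7):
        # For information gain, the observed values lie in [0, log2(n_classes)].
        eps = hoeffding_bound(math.log2(n_classes), delta, n_examples)
        return (best_gain - second_best_gain) > eps

    # e.g. after 2000 examples, gains 0.30 vs 0.22 on a binary problem:
    print(should_split(0.30, 0.22, 2000))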
|
About: LIBOL is an open-source library with a family of state-of-the-art online learning algorithms for machine learning and big data analytics research. The current version supports 16 online algorithms for binary classification and 13 online algorithms for multiclass classification. Changes:In contrast to our last version (V0.2.3), the new version (V0.3.0) makes some important changes, as follows: • Add a template and guide for adding new algorithms; • Improve parameter settings and make the documentation clearer; • Improve documentation on data formats and key functions; • Amend the "OGD" function to use different loss types; • Fix some name inconsistencies and other minor bugs.
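To illustrate what an online gradient descent (OGD) update with a selectable loss looks like, here is a generic Python sketch (not LIBOL's actual code; the loss names and structure are illustrative).

    import math

    def loss_gradient(loss_type, w, x, y):
        """Derivative of the chosen loss with respect to the margin y * <w, x>."""
        margin = y * sum(wi * xi for wi, xi in zip(w, x))
        if loss_type == "hinge":
            return -1.0 if margin < 1.0 else 0.0
        if loss_type == "logistic":
            return -1.0 / (1.0 + math.exp(margin))
        if loss_type == "squared":
            return margin - 1.0   # from the loss (1 - margin)^2 / 2
        raise ValueError(loss_type)

    def ogd_step(w, x, y, eta, loss_type="hinge"):
        """One OGD update: w <- w - eta * (dloss/dmargin) * y * x."""
        g = loss_gradient(loss_type, w, x, y)
        return [wi - eta * g * y * xi for wi, xi in zip(w, x)]

    w = [0.0, 0.0]
    w = ogd_step(w, [1.0, 2.0], +1, eta=0.1, loss_type="logistic")
    print(w)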
|
About: Jubatus is a general framework library for online and distributed machine learning. It currently supports classification, regression, clustering, recommendation, nearest neighbors, anomaly detection, and graph analysis. Loose model sharing provides higher scalability, better performance, and real-time capabilities by combining online learning with distributed computations. Changes:0.5.0 adds new support for clustering and nearest neighbors. For more detail, see http://t.co/flMcTcYZVs
|
About: This package contains a Python and a Matlab implementation of the most widely used algorithms for multi-armed bandit problems. The purpose of this package is to provide simple environments for the comparison and numerical evaluation of policies. Changes:Initial Announcement on mloss.org.
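As an example of the kind of policy such a package typically evaluates, here is a minimal UCB1 sketch in Python (illustrative only; it does not reproduce this package's interfaces).

    import math, random

    def ucb1(counts, values, t):
        """Pick the arm maximising mean + sqrt(2 ln t / n); play unplayed arms first."""
        for arm, n in enumerate(counts):
            if n == 0:
                return arm
        return max(range(len(counts)),
                   key=lambda a: values[a] + math.sqrt(2.0 * math.log(t) / counts[a]))

    # Toy run on two Bernoulli arms.
    probs = [0.3, 0.6]
    counts, values = [0, 0], [0.0, 0.0]
    for t in range(1, 501):
        a = ucb1(counts, values, t)
        r = 1.0 if random.random() < probs[a] else 0.0
        counts[a] += 1
        values[a] += (r - values[a]) / counts[a]
    print(counts, values)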
|
About: Locally Weighted Projection Regression (LWPR) is a recent algorithm that achieves nonlinear function approximation in high dimensional spaces with redundant and irrelevant input dimensions. At its [...] Changes:Version 1.2.4
|
About: OpenViBE is an open-source platform that enables the design, testing, and use of Brain-Computer Interfaces (BCIs). Broadly speaking, OpenViBE can be used in many real-time Neuroscience applications [...] Changes:New release 0.8.0.
|
About: A fast implementation of several stochastic gradient descent learners for classification, ranking, and ROC area optimization, suitable for large, sparse data sets. Includes Pegasos SVM, SGD-SVM, Passive-Aggressive Perceptron, Perceptron with Margins, Logistic Regression, and ROMMA. A command-line utility and API libraries are provided. Changes:Initial Announcement on mloss.org.
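For reference, the core of the Pegasos SVM listed above is a single regularized hinge-loss SGD step with a 1/(lambda*t) learning rate; here is a minimal Python sketch of that update (illustrative, not this package's code).

    import math

    def pegasos_step(w, x, y, lam, t):
        """One Pegasos update: shrink by (1 - eta*lam), add eta*y*x on a margin
        violation, then project onto the ball of radius 1/sqrt(lam)."""
        eta = 1.0 / (lam * t)
        margin = y * sum(wi * xi for wi, xi in zip(w, x))
        if margin < 1.0:
            w = [(1.0 - eta * lam) * wi + eta * y * xi for wi, xi in zip(w, x)]
        else:
            w = [(1.0 - eta * lam) * wi for wi in w]
        norm = math.sqrt(sum(wi * wi for wi in w))
        radius = 1.0 / math.sqrt(lam)
        if norm > radius:
            w = [wi * radius / norm for wi in w]
        return w

    w = [0.0, 0.0]
    for t, (x, y) in enumerate([([1.0, 1.0], +1), ([-1.0, -0.5], -1)] * 50, start=1):
        w = pegasos_step(w, x, y, lam=0.1, t=t)
    print(w)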
|
About: Elefant is an open source software platform for the Machine Learning community, licensed under the Mozilla Public License (MPL) and developed using Python, C, and C++. We aim to make it the platform [...] Changes:This release contains the Stream module as a first step in the direction of providing C++ library support. Stream aims to be a software framework for the implementation of large scale online learning algorithms. Large scale, in this context, should be understood as something that does not fit in the memory of a standard desktop computer. Added Bundle Methods for Regularized Risk Minimization (BMRM), allowing users to choose from a list of loss functions and solvers (linear and quadratic). Added the following loss classes: BinaryClassificationLoss, HingeLoss, SquaredHingeLoss, ExponentialLoss, LogisticLoss, NoveltyLoss, LeastMeanSquareLoss, LeastAbsoluteDeviationLoss, QuantileRegressionLoss, EpsilonInsensitiveLoss, HuberRobustLoss, PoissonRegressionLoss, MultiClassLoss, WinnerTakesAllMultiClassLoss, ScaledSoftMarginMultiClassLoss, SoftmaxMultiClassLoss, MultivariateRegressionLoss. The graphical user interface now provides extensive documentation for each component, explaining state variables and port descriptions. Changed saving and loading of experiments to XML (thereby avoiding storage of large input data structures). Unified automatic input checking via new static typing extending Python properties. Full support for recursive composition of larger components containing arbitrary statically typed state variables.
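To give a feel for the loss-class contract a bundle method such as BMRM relies on, here is a hypothetical Python sketch of a hinge loss exposing a value and a subgradient, the two quantities from which BMRM builds its cutting planes (class and method names are illustrative, not Elefant's API).

    class HingeLoss:
        """Value and subgradient of the hinge loss over a dataset."""
        def __init__(self, xs, ys):
            self.xs, self.ys = xs, ys
        def value_and_subgradient(self, w):
            val, grad = 0.0, [0.0] * len(w)
            for x, y in zip(self.xs, self.ys):
                margin = y * sum(wi * xi for wi, xi in zip(w, x))
                if margin < 1.0:
                    val += 1.0 - margin
                    grad = [gi - y * xi for gi, xi in zip(grad, x)]
            return val, grad

    loss = HingeLoss([[1.0, 0.0], [0.0, 1.0]], [+1, -1])
    v, g = loss.value_and_subgradient([0.0, 0.0])
    # A bundle method would add the cutting plane l(w) >= v + <g, w - w_t> and re-solve.
    print(v, g)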
|
About: This package implements the “Online Random Forests” (ORF) algorithm of Saffari et al., ICCV-OLCV 2009. This algorithm extends offline Random Forests (RF) to learn from online training data samples. ORF is a multi-class classifier that learns multi-class problems directly, without 1-vs-all or 1-vs-1 binary decompositions. Changes:Initial Announcement on mloss.org.
|
About: Reference implementation of the LASVM online and active SVM algorithms as described in the JMLR paper. The interesting bit is a small C library that implements the LASVM process and reprocess [...] Changes:Minor bug fix
|
About: Debellor is a scalable and extensible platform which provides common architecture for data mining and machine learning algorithms of various types. Changes:
|
About: OLaRankGreedy is an online solver of the dual formulation of support vector machines for sequence labeling using greedy inference. Changes:Initial Announcement on mloss.org.
|