About: RLLib is a lightweight C++ template library that implements incremental, standard, and gradient temporal-difference learning algorithms in reinforcement learning. It is an optimized library for robotic applications and embedded devices that operate under fast duty cycles (e.g., < 30 ms). RLLib has been tested and evaluated on RoboCup 3D soccer simulation agents, physical NAO V4 humanoid robots, and Tiva C series LaunchPad microcontrollers to predict, control, learn behaviors, and represent learnable knowledge. The implementation of the RLLib library is inspired by the RLPark API, a library of temporal-difference learning algorithms written in Java. Changes: Current release version is v2.0.
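The temporal-difference update at the heart of libraries like RLLib can be sketched in a few lines. This is an illustrative tabular TD(0) step in Python, not RLLib's actual C++ API; the function name and parameters are ours.

```python
# Minimal sketch of a tabular TD(0) value update (illustrative only;
# RLLib's real C++ interface is template-based and differs).

def td0_update(V, s, r, s_next, alpha=0.1, gamma=0.95):
    """One TD(0) step: V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s))."""
    td_error = r + gamma * V[s_next] - V[s]
    V[s] = V[s] + alpha * td_error
    return V

# One transition: state 0 -> state 1 with reward 1.0.
V = {0: 0.0, 1: 0.0}
V = td0_update(V, s=0, r=1.0, s_next=1)
```

The constant-memory, per-transition nature of this update is what makes such algorithms suitable for the fast duty cycles mentioned above.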

About: Log-linear analysis for high-dimensional data. Changes: Initial Announcement on mloss.org.

About: MOSIS is a modularized framework for signal processing, stream analysis, machine learning and stream mining applications. Changes:

About: The package computes the optimal parameters for the Choquet kernel. Changes: Initial Announcement on mloss.org.

About: An implementation of the "Ordinal Choquistic Regression" model using maximum likelihood estimation. Changes: Initial Announcement on mloss.org.

About: minFunc is a Matlab function for unconstrained optimization of differentiable real-valued multivariate functions using line-search methods. It uses an interface very similar to the Matlab Optimization Toolbox function fminunc, and can be called as a replacement for this function. On many problems, minFunc requires fewer function evaluations to converge than fminunc (or minimize.m). Further, it can optimize problems with a much larger number of variables (fminunc is restricted to several thousand variables), and uses a line search that is robust to several common function pathologies. Changes: Initial Announcement on mloss.org.
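The kind of line-search strategy such optimizers rely on can be sketched as a backtracking (Armijo) search. This is a hedged Python illustration of the general technique, not minFunc's Matlab interface; all names here are ours.

```python
# Sketch of a backtracking (Armijo) line search, the family of strategies
# line-search optimizers use internally (illustrative; not minFunc's API).
import numpy as np

def backtracking_line_search(f, grad_f, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink step size alpha until the Armijo sufficient-decrease condition holds."""
    fx, gx = f(x), grad_f(x)
    while f(x + alpha * d) > fx + c * alpha * gx.dot(d):
        alpha *= rho
    return alpha

# Minimise f(x) = x'x from x = [2, 2] along the steepest-descent direction.
f = lambda x: x.dot(x)
g = lambda x: 2 * x
x = np.array([2.0, 2.0])
step = backtracking_line_search(f, g, x, d=-g(x))
```

The sufficient-decrease test (rather than an exact minimisation along the direction) is what gives such searches their robustness to the "function pathologies" mentioned above.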

About: LIBOL is an open-source library with a family of state-of-the-art online learning algorithms for machine learning and big data analytics research. The current version supports 16 online algorithms for binary classification and 13 online algorithms for multiclass classification. Changes: In contrast to our last version (V0.2.3), the new version (V0.3.0) has made some important changes as follows:
• Add a template and guide for adding new algorithms;
• Improve parameter settings and make documentation clearer;
• Improve documentation on data formats and key functions;
• Amend the "OGD" function to use different loss types;
• Fix some name inconsistencies and other minor bugs.
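The online gradient descent (OGD) family mentioned in the changes can be sketched as a single per-example update. This is an illustrative Python version with hinge loss; LIBOL itself is a Matlab/C library and its function signatures differ.

```python
# Sketch of an online gradient descent (OGD) step for binary classification
# with hinge loss (illustrative; not LIBOL's actual interface).
import numpy as np

def ogd_step(w, x, y, eta=0.1):
    """Update weights on one example (x, y) with y in {-1, +1}."""
    if y * w.dot(x) < 1:          # margin violated: take a gradient step
        w = w + eta * y * x
    return w

# Stream two examples through the learner, one at a time.
w = np.zeros(2)
for x, y in [(np.array([1.0, 0.0]), +1), (np.array([0.0, 1.0]), -1)]:
    w = ogd_step(w, x, y)
```

Processing one example at a time and discarding it afterwards is what distinguishes this online setting from batch training.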

About: The glmie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glmie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference. Changes: Added factorial mean-field inference as a third algorithm complementing expectation propagation and variational Bayes; generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used.

About: FABIA is a biclustering algorithm that clusters rows and columns of a matrix simultaneously. Consequently, members of a row cluster are similar to each other on a subset of columns and, analogously, members of a column cluster are similar to each other on a subset of rows. Biclusters are found by factor analysis where both the factors and the loading matrix are sparse. FABIA is a multiplicative model that extracts linear dependencies between samples and feature patterns. Applications include detection of transcriptional modules in gene expression data and identification of haplotypes / identity-by-descent segments consisting of rare variants obtained by next-generation sequencing. Changes: Updates across versions 2.8.0, 2.4.0, 2.3.1, 2.0.0, and 1.4.0.
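The multiplicative model underlying this kind of sparse factor-analysis biclustering can be sketched as a sum of outer products of sparse loading and factor vectors, each outer product defining one bicluster. The NumPy snippet below only illustrates the model structure; FABIA's actual fitting procedure is a Bayesian factor analysis and is not shown.

```python
# Sketch of the sparse factor-analysis model behind biclustering:
# X ~ lambda * z' + noise, where both lambda (loadings over rows) and
# z (factors over columns) are sparse, so the outer product is non-zero
# only on one row subset x column subset -- a bicluster. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# One bicluster: rows 0-2 and columns 0-3 share a common pattern.
lam = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])           # sparse loadings
z = np.array([1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0])   # sparse factors
X = np.outer(lam, z) + 0.01 * rng.standard_normal((6, 8))
```

Sparsity of both vectors is what makes the recovered structure a bicluster rather than a global factor.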

About: MLlib provides a distributed machine learning (ML) library to address the growing need for scalable ML. MLlib is developed in Spark (http://spark.incubator.apache.org/), a cluster computing system designed for iterative computation. Moreover, it is a component of a larger system called MLbase (www.mlbase.org) that aims to provide user-friendly distributed ML functionality both for ML researchers and domain experts. MLlib currently consists of scalable implementations of algorithms for classification, regression, collaborative filtering and clustering. Changes: Initial Announcement on mloss.org.

About: This package includes implementations of the CCM, DMV and DMV+CCM parsers from Klein and Manning (2004), and code for testing them with the WSJ, Negra and Cast3LB corpora (English, German and Spanish respectively). A detailed description of the parsers can be found in Klein (2005). Changes: Initial Announcement on mloss.org.

About: CIlib is a library of computational intelligence algorithms and supporting components that allows simple extension and experimentation. The library is peer reviewed and is backed by a leading research group in the field. The library is under active development. Changes: Initial Announcement on mloss.org.

About: Stochastic neighbor embedding originally aims at the reconstruction of given distance relations in a low-dimensional Euclidean space. This can be regarded as a general approach to multidimensional scaling, but the reconstruction is based on the definition of input (and output) neighborhood probability alone. The present implementation also allows for handling dissimilarity- or score-induced neighborhood topologies and makes use of quasi-second-order gradient-based (L-)BFGS optimization. Changes:
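The input neighborhood probabilities that stochastic neighbor embedding works from can be sketched as a Gaussian conversion of pairwise distances. This is an illustrative Python fragment with a fixed bandwidth; the announced package additionally supports dissimilarity- and score-induced topologies, and real SNE implementations tune the bandwidth per point.

```python
# Sketch of SNE-style input neighborhood probabilities: each pairwise
# distance becomes a conditional probability via a Gaussian kernel
# (illustrative; fixed bandwidth, not the package's actual interface).
import numpy as np

def neighbor_probabilities(D, sigma=1.0):
    """Row-normalised Gaussian neighborhood probabilities from a distance matrix D."""
    P = np.exp(-D**2 / (2 * sigma**2))
    np.fill_diagonal(P, 0.0)            # a point is not its own neighbor
    return P / P.sum(axis=1, keepdims=True)

D = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
P = neighbor_probabilities(D)
```

The embedding is then optimised so that the same probabilities computed from output distances match these input probabilities, which is what lets the method accept dissimilarities rather than only metric distances.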

About: The aim is to embed a given data relationship matrix into a low-dimensional Euclidean space such that the point distances / distance ranks correlate best with the original input relationships. Input relationships may be given as (sparse) (asymmetric) distance, dissimilarity, or (negative!) score matrices. Input-output relations are modeled as low-conditioned. (Weighted) Pearson and soft Spearman rank correlation, and unweighted soft Kendall correlation are supported correlation measures for input/output object neighborhood relationships. Changes:

About: The toolbox from the paper Near-optimal Experimental Design for Model Selection in Systems Biology (Busetto et al. 2013, submitted), implemented in MATLAB. Changes: Initial Announcement on mloss.org.

About: Block-coordinate Frank-Wolfe optimization for structural SVMs. Changes: Initial Announcement on mloss.org.

About: This toolbox implements models for Bayesian mixed-effects inference on classification performance in hierarchical classification analyses. Changes: In addition to the existing MATLAB implementation, the toolbox now also contains an R package of the variational Bayesian algorithm for mixed-effects inference.

About: Python Framework for Vector Space Modelling that can handle unlimited datasets (streamed input, online algorithms work incrementally in constant memory). Changes:
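The streamed-input design this framework describes amounts to representing a corpus as any iterable that yields one document at a time, so the full dataset never has to be resident in memory. The plain-Python sketch below illustrates that idea; the class and names here are ours, not the framework's actual API.

```python
# Sketch of a streamed corpus: documents are yielded one at a time as
# sparse bag-of-words vectors, so memory use is constant in corpus size
# (illustrative; not the framework's real interface).

class StreamedCorpus:
    """Yield (token_id, count) vectors one document at a time."""
    def __init__(self, documents, vocab):
        self.documents = documents      # any iterable of token lists
        self.vocab = vocab              # token -> integer id

    def __iter__(self):
        for tokens in self.documents:
            counts = {}
            for t in tokens:
                if t in self.vocab:
                    tid = self.vocab[t]
                    counts[tid] = counts.get(tid, 0) + 1
            yield sorted(counts.items())

vocab = {"cat": 0, "dog": 1}
corpus = StreamedCorpus([["cat", "cat", "dog"], ["dog"]], vocab)
```

Because the corpus is re-iterable rather than materialised, incremental (online) algorithms can make multiple passes while still running in constant memory.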

About: PLEASD: A Matlab toolbox for structured learning. Changes: Initial Announcement on mloss.org.
