About: A native Python, scikit-compatible implementation of a variety of multi-label classification algorithms. Changes:
* kNN classifiers properly support sparse matrices
* support for the new model_selection API from scikit-learn
* extended graph-based label space clusterers to allow taking the probability of a label occurring alone into consideration
* compatible with the newest graph-tool
* support for the case when MEKA decides that an observation doesn't have any labels assigned
* HARAM classifier provided by Fernando Benitez from the University of Konstanz
* predict_proba added to problem transformation classifiers
* ported to Python 3
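The problem-transformation approach mentioned above (to which predict_proba was added) reduces multi-label classification to one independent binary problem per label, a scheme usually called binary relevance. A library-free sketch of the idea, using a toy nearest-centroid base classifier (all class names here are illustrative, not this package's API):

```python
import numpy as np

class NearestCentroid:
    """Toy binary base classifier: predict the class whose centroid is closer."""
    def fit(self, X, y):
        self.c0 = X[y == 0].mean(axis=0)
        self.c1 = X[y == 1].mean(axis=0)
        return self
    def predict(self, X):
        d0 = np.linalg.norm(X - self.c0, axis=1)
        d1 = np.linalg.norm(X - self.c1, axis=1)
        return (d1 < d0).astype(int)

class BinaryRelevance:
    """Problem transformation: train one binary classifier per label column."""
    def __init__(self, base_factory):
        self.base_factory = base_factory
    def fit(self, X, Y):
        self.models_ = [self.base_factory().fit(X, Y[:, j])
                        for j in range(Y.shape[1])]
        return self
    def predict(self, X):
        return np.column_stack([m.predict(X) for m in self.models_])

X = np.array([[0., 0.], [0., 1.], [5., 5.], [5., 6.]])
Y = np.array([[1, 0], [1, 0], [0, 1], [0, 1]])  # two binary labels per row
clf = BinaryRelevance(NearestCentroid).fit(X, Y)
print(clf.predict(np.array([[0., 0.5], [5., 5.5]])))  # [[1 0], [0 1]]
```

Any scikit-style estimator with fit/predict could stand in for the toy base classifier, which is what makes the transformation approach composable.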

About: A library of scalable Bayesian generalised linear models with fancy features. Changes:

About: RLScore: a package of regularized least-squares machine learning algorithms. Changes: Initial Announcement on mloss.org.
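Regularized least-squares, the family of methods RLScore packages, has the closed-form solution w = (X^T X + lam*I)^{-1} X^T y. A small numpy sketch of that formula (not RLScore's API):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form regularized least-squares: w = (X^T X + lam*I)^{-1} X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

w = ridge_fit(X, y, lam=1e-3)
print(w)  # close to w_true
```

Using `np.linalg.solve` rather than explicitly inverting the matrix is both faster and numerically safer.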

About: An extensible C++ library of hierarchical Bayesian clustering algorithms, such as Bayesian Gaussian mixture models, variational Dirichlet processes, Gaussian latent Dirichlet allocation and more. Changes: New maximum-cluster argument for all algorithms. Also, the MATLAB interface has been dropped, since it seemed no one was using it and I can no longer support it.
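A maximum-cluster cap of the kind added above plays the same role as the truncation level in a truncated Dirichlet-process mixture: at most that many components can receive weight. A numpy sketch of the truncated stick-breaking construction (illustrative of the concept, not this C++ library's interface):

```python
import numpy as np

def stick_breaking_weights(alpha, max_clusters, rng):
    """Truncated stick-breaking construction of Dirichlet-process mixture weights.

    Each beta_k ~ Beta(1, alpha) takes a fraction of the remaining stick;
    the truncation level caps the number of non-zero component weights.
    """
    betas = rng.beta(1.0, alpha, size=max_clusters)
    betas[-1] = 1.0  # absorb the remaining stick so the weights sum to one
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

rng = np.random.default_rng(42)
w = stick_breaking_weights(alpha=2.0, max_clusters=10, rng=rng)
print(w.sum())  # 1.0
```

Larger `alpha` spreads mass across more components; small `alpha` concentrates it in the first few, which is why the cap is usually harmless in practice.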

About: xgboost: eXtreme Gradient Boosting, an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically do parallel computation with OpenMP, and it can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification and ranking. The package is made to be extensible, so that users can easily define their own objectives. The newest version of xgboost supports distributed learning on platforms such as Hadoop and MPI, and scales to even larger problems. Changes:
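The gradient boosting framework that xgboost implements fits, at each round, a weak learner to the negative gradient of the loss; for squared error that gradient is simply the current residual. A minimal numpy sketch of the framework with stump-sized trees (conceptual only, not xgboost's code or API):

```python
import numpy as np

def fit_stump(X, residual):
    """Best single-feature threshold split minimizing squared error on residuals."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue
            pred = np.where(left, residual[left].mean(), residual[~left].mean())
            err = ((residual - pred) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, residual[left].mean(), residual[~left].mean())
    _, j, t, lv, rv = best
    return lambda Z, j=j, t=t, lv=lv, rv=rv: np.where(Z[:, j] <= t, lv, rv)

def gradient_boost(X, y, n_rounds=50, lr=0.3):
    """Each round fits a stump to the residuals (the negative gradient of
    squared error) and adds it to the ensemble, shrunk by a learning rate."""
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_rounds):
        stump = fit_stump(X, y - pred)
        pred += lr * stump(X)
        stumps.append(stump)
    return lambda Z: y.mean() + lr * sum(s(Z) for s in stumps)

X = np.array([[0.], [1.], [2.], [3.]])
y = np.array([0., 0., 1., 1.])
model = gradient_boost(X, y)
print(model(X))  # close to y
```

Swapping in a different loss only changes what "residual" means: the weak learner is always fit to the loss's negative gradient, which is the extensibility hook xgboost exposes for user-defined objectives.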

About: MALSS is a Python module to facilitate machine learning tasks. Changes: Initial Announcement on mloss.org.

About: pyGPs is a Python package for Gaussian process (GP) regression and classification for machine learning. Changes: Changelog pyGPs v1.3.2, December 15th, 2014.
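GP regression of the kind pyGPs provides rests on the standard predictive equations: mean = K_*^T (K + sigma^2 I)^{-1} y and variance = K_** - K_*^T (K + sigma^2 I)^{-1} K_*. A minimal numpy sketch with an RBF kernel (illustrative only, not pyGPs's API):

```python
import numpy as np

def rbf(A, B, lengthscale=1.0):
    """Squared-exponential (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_predict(X, y, Xs, noise=1e-2):
    """GP predictive mean and per-point variance at test inputs Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mean = Ks.T @ np.linalg.solve(K, y)
    cov = rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

X = np.array([[0.0], [1.0], [2.0]])
y = np.sin(X).ravel()
mean, var = gp_predict(X, y, np.array([[1.0]]))
print(mean, var)  # mean near sin(1); small variance at a training input
```

The variance shrinks toward the noise level near training inputs and grows back to the prior far from them, which is the calibrated-uncertainty behaviour GP packages are used for.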

About: Gaussian processes with general nonlinear likelihoods, using the unscented transform or Taylor-series linearisation. Changes: Initial Announcement on mloss.org.
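The unscented transform mentioned above propagates a Gaussian through a nonlinearity by pushing a small set of deterministically chosen sigma points through it and re-estimating mean and variance from the results. A minimal 1-D numpy sketch (conceptual, not this package's API):

```python
import numpy as np

def unscented_transform_1d(mean, var, f, kappa=2.0):
    """Propagate N(mean, var) through nonlinearity f using 3 sigma points."""
    spread = np.sqrt((1.0 + kappa) * var)
    sigma_points = np.array([mean, mean + spread, mean - spread])
    weights = np.array([kappa / (1.0 + kappa),
                        0.5 / (1.0 + kappa),
                        0.5 / (1.0 + kappa)])
    fx = f(sigma_points)
    out_mean = weights @ fx
    out_var = weights @ (fx - out_mean) ** 2
    return out_mean, out_var

# For a linear map the transform is exact: y = 2x + 1 with x ~ N(1, 4)
m, v = unscented_transform_1d(1.0, 4.0, lambda x: 2.0 * x + 1.0)
print(m, v)  # 3.0, 16.0
```

Unlike Taylor-series linearisation, no derivatives of `f` are needed, which is why the unscented transform is convenient for arbitrary nonlinear likelihoods.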

About: pySPACE is the abbreviation for "Signal Processing and Classification Environment in Python using YAML and supporting parallelization". It is modular software for processing large data streams that has been specifically designed to enable distributed execution and empirical evaluation of signal processing chains. Various signal processing algorithms (so-called nodes) are available within the software, ranging from finite impulse response filters through data-dependent spatial filters (e.g. CSP, xDAWN) to established classifiers (e.g. SVM, LDA). pySPACE incorporates the concept of nodes and node chains from the MDP framework. Due to its modular architecture, the software can easily be extended with new processing nodes and more general operations. Large-scale empirical investigations can be configured using simple text configuration files in the YAML format, executed on different (distributed) computing modalities, and evaluated using an interactive graphical user interface. Changes: improved testing, improved documentation, Windows compatibility, more algorithms.
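A node chain of the sort pySPACE assembles from YAML configuration can be pictured as a sequence of processing nodes, each transforming the output of the previous one. A generic Python sketch of that pattern (the class names here are illustrative, not pySPACE's API):

```python
class Node:
    """A processing node: transforms data and hands it to the next stage."""
    def execute(self, data):
        raise NotImplementedError

class Detrend(Node):
    """Remove the mean from a signal window."""
    def execute(self, data):
        mean = sum(data) / len(data)
        return [x - mean for x in data]

class Rectify(Node):
    """Full-wave rectification: absolute value of each sample."""
    def execute(self, data):
        return [abs(x) for x in data]

class NodeChain:
    """Run data through a list of nodes in order."""
    def __init__(self, nodes):
        self.nodes = nodes
    def execute(self, data):
        for node in self.nodes:
            data = node.execute(data)
        return data

chain = NodeChain([Detrend(), Rectify()])
print(chain.execute([1.0, 2.0, 3.0]))  # [1.0, 0.0, 1.0]
```

Because each node only sees the previous node's output, chains like this can be described declaratively (e.g. as a YAML list of node names) and swapped or parallelized without touching the nodes themselves.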

About: Crino: a neural-network library based on Theano. Changes: 1.0.0 (7 July 2014):
* Initial release of crino
* Implements a Torch-like library to build artificial neural networks (ANN)
* Provides standard implementations for:
  * autoencoders
  * multilayer perceptrons (MLP)
  * deep neural networks (DNN)
  * input-output deep architecture (IODA)
* Provides a batch-gradient backpropagation algorithm with adaptive learning rate
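A multilayer perceptron trained by batch-gradient backpropagation, as Crino provides, can be sketched in a few lines of numpy. This is a toy one-hidden-layer version with a fixed learning rate (illustrative only, not Crino's Theano-based API):

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_init(n_in, n_hidden, n_out):
    """One-hidden-layer MLP parameters (tanh hidden layer, linear output)."""
    return {
        "W1": rng.normal(0, 0.5, (n_in, n_hidden)), "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.5, (n_hidden, n_out)), "b2": np.zeros(n_out),
    }

def mlp_loss_and_grads(p, X, Y):
    """Forward pass plus backpropagation for mean-squared error."""
    H = np.tanh(X @ p["W1"] + p["b1"])
    out = H @ p["W2"] + p["b2"]
    diff = out - Y
    loss = (diff ** 2).mean()
    d_out = 2 * diff / diff.size
    d_H = (d_out @ p["W2"].T) * (1 - H ** 2)  # tanh derivative
    return loss, {"W2": H.T @ d_out, "b2": d_out.sum(0),
                  "W1": X.T @ d_H, "b1": d_H.sum(0)}

# One batch-gradient step on toy data lowers the training loss.
X = rng.normal(size=(16, 3)); Y = rng.normal(size=(16, 1))
p = mlp_init(3, 5, 1)
loss0, g = mlp_loss_and_grads(p, X, Y)
for k in p:
    p[k] -= 0.05 * g[k]
loss1, _ = mlp_loss_and_grads(p, X, Y)
print(loss0, loss1)  # loss decreases
```

An adaptive learning rate, as in Crino's trainer, would replace the fixed 0.05 with a step size adjusted from the observed loss; Theano additionally derives the gradients symbolically rather than by hand as here.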

About: Embarrassingly Parallel Array Computing: EPAC is a machine learning workflow builder. Changes: Initial Announcement on mloss.org.

About: A library for artificial neural networks. Changes:Added algorithms:

About: A collection of Python code for performing research in optimization. The aim is to provide reusable components that can be quickly applied to machine learning problems. Used in:
* ellipsoidal multiple instance learning
* difference-of-convex-functions algorithms for sparse classification
* a contextual bandits upper confidence bound algorithm (using GP)
* learning output kernels, that is, kernels between the labels of a classifier
Changes:
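Upper-confidence-bound bandit algorithms of the kind listed above trade off exploration and exploitation by always playing the arm with the highest empirical-mean-plus-confidence score. A minimal UCB1 sketch in pure Python on Bernoulli arms (conceptual, not this library's API; the GP-based contextual variant replaces the empirical mean with a GP posterior):

```python
import math
import random

def ucb1(arm_means, horizon=5000, seed=0):
    """UCB1: play the arm maximizing mean + sqrt(2 ln t / n_pulls)."""
    rng = random.Random(seed)
    n = [0] * len(arm_means)        # pulls per arm
    total = [0.0] * len(arm_means)  # accumulated reward per arm
    for t in range(1, horizon + 1):
        if t <= len(arm_means):
            arm = t - 1             # play each arm once to initialize
        else:
            arm = max(range(len(arm_means)),
                      key=lambda a: total[a] / n[a]
                                    + math.sqrt(2 * math.log(t) / n[a]))
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        n[arm] += 1
        total[arm] += reward
    return n

pulls = ucb1([0.2, 0.5, 0.8])
print(pulls)  # the 0.8 arm receives the bulk of the pulls
```

The confidence term shrinks as an arm is pulled more, so under-explored arms keep getting sampled until the algorithm is confident they are worse.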

About: A Python implementation of Breiman's random forests. Changes: Initial Announcement on mloss.org.
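Breiman's random forests combine bootstrap resampling of the training rows with random feature selection per tree, then aggregate predictions by majority vote. A miniature numpy sketch with stump-sized trees and one random feature each (illustrative of the recipe, not this package's interface):

```python
import numpy as np

def fit_stump(X, y, feat):
    """Axis-aligned decision stump on one feature, best threshold by training error."""
    best = None
    for t in np.unique(X[:, feat]):
        left = X[:, feat] <= t
        lv = int(np.bincount(y[left], minlength=2).argmax()) if left.any() else 0
        rv = int(np.bincount(y[~left], minlength=2).argmax()) if (~left).any() else lv
        err = (np.where(left, lv, rv) != y).sum()
        if best is None or err < best[0]:
            best = (err, t, lv, rv)
    _, t, lv, rv = best
    return lambda Z, t=t, lv=lv, rv=rv: np.where(Z[:, feat] <= t, lv, rv)

def random_forest(X, y, n_trees=25, seed=0):
    """Breiman's two ingredients in miniature: bootstrap rows + a random
    feature per tree; predict by majority vote over the ensemble."""
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), len(X))     # bootstrap sample of rows
        feat = int(rng.integers(0, X.shape[1]))   # random feature for this stump
        trees.append(fit_stump(X[idx], y[idx], feat))
    return lambda Z: (np.mean([t(Z) for t in trees], axis=0) >= 0.5).astype(int)

X = np.array([[0., 0.], [1., 0.], [0., 1.], [5., 5.], [6., 5.], [5., 6.]])
y = np.array([0, 0, 0, 1, 1, 1])
forest = random_forest(X, y)
print(forest(np.array([[-1., -1.], [10., 10.]])))  # [0 1]
```

A full implementation grows deep trees and samples a feature subset at every split rather than one feature per tree, but the decorrelation-by-randomness idea is the same.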

About: A chatterbot that learns natural languages by imitation. Changes: Alpha 1, codename: Wendell Borton ("Bllluuhhhhh...!!"). Short-term memory greatly improved.
