About: The Universal Java Matrix Package (UJMP) is a data processing tool for Java. Unlike JAMA and Colt, it supports multithreading and is therefore much faster on current hardware. It is not limited to matrices of double values: it handles every type of data as a matrix through a common interface, e.g. CSV files, Excel files, images, WAVE audio files, tables in SQL databases, and much more. Changes: Updated to version 0.3.0.

About: xgboost: eXtreme Gradient Boosting. An efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically parallelize computation with OpenMP and can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification, and ranking. The package is designed to be extensible, so users can easily define their own objectives. The newest version of xgboost supports distributed learning on platforms such as Hadoop and MPI and scales to even larger problems. Changes:
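To illustrate the core idea that xgboost optimizes, here is a minimal pure-Python sketch of gradient boosting for squared-error regression with depth-1 "stumps" as base learners. This is not xgboost's implementation or API; all function names are illustrative.

```python
# Minimal gradient boosting sketch (squared error, stump base learners).
# Not xgboost's code -- just the algorithm the package implements efficiently.
def fit_stump(xs, residuals):
    """Find the threshold split minimizing squared error on the residuals."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def gradient_boost(xs, ys, n_rounds=100, lr=0.3):
    """Fit stumps to the negative gradient (= residuals for squared error)."""
    ensemble = []
    pred = [0.0] * len(xs)
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        ensemble.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: lr * sum(s(x) for s in ensemble)

# Toy 1-D regression problem.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 1.9, 3.1, 3.9, 5.2]
model = gradient_boost(xs, ys)
```

After enough rounds the shrunken ensemble fits the training points closely; a custom objective would replace the residual computation with its own gradient, which is exactly the extension point the package exposes.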

About: A parallel LDA learning toolbox for big topic modeling on multicore systems. Changes: Initial Announcement on mloss.org.
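The workhorse such toolboxes parallelize is collapsed Gibbs sampling for LDA. The following is a tiny single-threaded sketch under our own variable names (not the toolbox's code or API):

```python
import random

# Collapsed Gibbs sampling for LDA (illustrative sketch).
def lda_gibbs(docs, n_topics, vocab_size, n_iters=200, alpha=0.1, beta=0.01):
    random.seed(0)
    # z[d][i]: topic assignment of word i in document d.
    z = [[random.randrange(n_topics) for _ in doc] for doc in docs]
    n_tw = [[0] * vocab_size for _ in range(n_topics)]   # topic-word counts
    n_dt = [[0] * n_topics for _ in docs]                # doc-topic counts
    n_t = [0] * n_topics                                 # topic totals
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            t = z[d][i]
            n_tw[t][w] += 1; n_dt[d][t] += 1; n_t[t] += 1
    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                n_tw[t][w] -= 1; n_dt[d][t] -= 1; n_t[t] -= 1
                # Full conditional:
                # p(t) ~ (doc-topic + alpha) * (topic-word + beta) / (topic total + V*beta)
                weights = [(n_dt[d][k] + alpha) * (n_tw[k][w] + beta)
                           / (n_t[k] + vocab_size * beta)
                           for k in range(n_topics)]
                r = random.random() * sum(weights)
                t = 0
                while t < n_topics - 1 and r > weights[t]:
                    r -= weights[t]; t += 1
                z[d][i] = t
                n_tw[t][w] += 1; n_dt[d][t] += 1; n_t[t] += 1
    return n_tw

# Two clearly separated kinds of "documents": words {0,1} vs. words {2,3}.
docs = [[0, 1, 0, 1, 0], [2, 3, 2, 3, 2]] * 3
topic_word = lda_gibbs(docs, n_topics=2, vocab_size=4)
```

Parallel implementations distribute documents across cores and keep the global topic-word counts approximately synchronized, which is where the engineering effort of a "big topic modeling" toolbox goes.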

About: A MATLAB toolbox for defining complex machine learning comparisons. Changes: Initial Announcement on mloss.org.

About: pySPACE is the abbreviation for "Signal Processing and Classification Environment in Python using YAML and supporting parallelization". It is modular software for processing large data streams that has been specifically designed to enable distributed execution and empirical evaluation of signal processing chains. Various signal processing algorithms (so-called nodes) are available within the software, ranging from finite impulse response filters through data-dependent spatial filters (e.g. CSP, xDAWN) to established classifiers (e.g. SVM, LDA). pySPACE incorporates the node and node-chain concepts of the MDP framework. Due to its modular architecture, the software can easily be extended with new processing nodes and more general operations. Large-scale empirical investigations can be configured using simple text configuration files in the YAML format, executed on different (distributed) computing modalities, and evaluated using an interactive graphical user interface. Changes: improved testing, improved documentation, Windows compatibility, more algorithms.
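The node-chain pattern described above can be sketched in a few lines. The class and method names here are illustrative, not pySPACE's actual API: each node transforms a signal and hands the result to the next node.

```python
# Toy "node chain": each node exposes execute() and is composed in sequence.
class MovingAverageNode:
    """A crude finite impulse response low-pass filter node."""
    def __init__(self, width=3):
        self.width = width
    def execute(self, signal):
        out = []
        for i in range(len(signal)):
            window = signal[max(0, i - self.width + 1):i + 1]
            out.append(sum(window) / len(window))
        return out

class ThresholdClassifierNode:
    """Labels each sample by comparing the filtered value to a threshold."""
    def __init__(self, threshold=0.5):
        self.threshold = threshold
    def execute(self, signal):
        return [1 if v > self.threshold else 0 for v in signal]

def run_chain(nodes, signal):
    """Pass the signal through every node in order."""
    for node in nodes:
        signal = node.execute(signal)
    return signal

chain = [MovingAverageNode(width=2), ThresholdClassifierNode(threshold=0.5)]
labels = run_chain(chain, [0.0, 1.0, 1.0, 0.0, 0.0])
```

In the real framework, chains like this are declared in YAML configuration files and dispatched to distributed back ends rather than composed in code.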

About: MSVMpack is a Multiclass Support Vector Machine (MSVM) package. It is dedicated to SVMs which can handle more than two classes without relying on decomposition methods and implements the four MSVM models from the literature: Weston and Watkins MSVM, Crammer and Singer MSVM, Lee, Lin and Wahba MSVM, and the MSVM2 of Guermeur and Monfrini. Changes:

About: Introduces PLL, a parallel LDA learning toolbox for big topic modeling. Changes: Fixed some compilation errors.

About: A DBMS for resonating neural networks. Create and use different types of machine learning algorithms. Changes: AIML compatible (AIML files can be imported); new 'Grid channel' for developing board games; improved topics editor; new demo project: Alice (from AIML); lots of bug fixes and speed improvements.

About: This archive contains a Matlab implementation of the Multilinear Principal Component Analysis (MPCA) algorithm and MPCA+LDA, as described in the paper Haiping Lu, K.N. Plataniotis, and A.N. Venetsanopoulos, "MPCA: Multilinear Principal Component Analysis of Tensor Objects", IEEE Transactions on Neural Networks, Vol. 19, No. 1, pp. 18-39, January 2008. Changes: Initial Announcement on mloss.org.

About: Motivated by a need to classify high-dimensional, heterogeneous data from the bioinformatics domain, we developed ML-Flex, a machine-learning toolbox that enables users to perform two-class and multi-class classification analyses in a systematic yet flexible manner. ML-Flex was written in Java but is capable of interfacing with third-party packages written in other programming languages. It can handle multiple input-data formats and supports a variety of customizations. ML-Flex provides implementations of various validation strategies, which can be executed in parallel across multiple computing cores, processors, and nodes. Additionally, ML-Flex supports aggregating evidence across multiple algorithms and data sets via ensemble learning. (See http://jmlr.csail.mit.edu/papers/volume13/piccolo12a/piccolo12a.pdf.) Changes: Initial Announcement on mloss.org.

About: Learns gradient boosted regression tree ensembles in parallel on shared-memory or cluster systems. Changes: Initial Announcement on mloss.org.

About: Implementation of LSTM for biological sequence analysis (classification, regression, motif discovery, remote homology detection). Additionally, an LSTM used as logistic regression with a spectrum kernel is included. Changes: Spectrum LSTM package included.
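For readers unfamiliar with LSTM, the forward step of a single cell can be sketched in plain Python. This is an illustrative scalar version under our own parameterization, not the package's implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM time step for scalar input and state.

    W holds (weight_x, weight_h, bias) triples for the input gate (i),
    forget gate (f), output gate (o), and candidate cell value (g).
    """
    i = sigmoid(W['i'][0] * x + W['i'][1] * h_prev + W['i'][2])
    f = sigmoid(W['f'][0] * x + W['f'][1] * h_prev + W['f'][2])
    o = sigmoid(W['o'][0] * x + W['o'][1] * h_prev + W['o'][2])
    g = math.tanh(W['g'][0] * x + W['g'][1] * h_prev + W['g'][2])
    c = f * c_prev + i * g      # cell state: gated memory update
    h = o * math.tanh(c)        # hidden state / output
    return h, c

# Run a short sequence through the cell with fixed toy weights.
W = {k: (0.5, 0.5, 0.0) for k in ('i', 'f', 'o', 'g')}
h, c = 0.0, 0.0
for x in [1.0, 0.0, 1.0]:
    h, c = lstm_step(x, h, c, W)
```

The gated cell state is what lets LSTM carry information across long biological sequences, which is why it suits motif discovery and remote homology detection.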

About: Pebl is a Python library and command-line application for learning the structure of a Bayesian network given prior knowledge and observations. Changes: Updated to version 1.0.1.
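Structure learning of this kind typically searches over candidate networks guided by a decomposable score. Below is a toy sketch of a BIC-style score for binary variables; the names are ours and this is not Pebl's API:

```python
import math
from collections import Counter

# Score = log-likelihood minus a BIC complexity penalty, per node.
def node_score(data, child, parents):
    """BIC-style score of one node given its parent set (binary variables)."""
    n = len(data)
    counts = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    parent_counts = Counter(tuple(row[p] for p in parents) for row in data)
    loglik = sum(c * math.log(c / parent_counts[pa])
                 for (pa, _), c in counts.items())
    n_params = 2 ** len(parents)  # one free parameter per parent configuration
    return loglik - 0.5 * n_params * math.log(n)

def network_score(data, structure):
    """structure maps each variable index to its tuple of parent indices."""
    return sum(node_score(data, child, parents)
               for child, parents in structure.items())

# Toy data: variable 1 copies variable 0; variable 2 is constant.
data = [(0, 0, 0), (1, 1, 0), (0, 0, 0), (1, 1, 0)] * 5

# The structure with the edge 0 -> 1 should outscore the empty network.
with_edge = network_score(data, {0: (), 1: (0,), 2: ()})
no_edge = network_score(data, {0: (), 1: (), 2: ()})
```

A greedy or simulated-annealing search then proposes edge additions, deletions, and reversals, keeping changes that improve the score; prior knowledge enters by constraining which edges may be proposed.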

About: dysii is a C++ library for distributed probabilistic inference and learning in large-scale dynamical systems. It provides methods such as the Kalman, unscented Kalman, and particle filters and [...] Changes: Initial Announcement on mloss.org.
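The simplest member of that filter family is the one-dimensional Kalman filter; the sketch below shows one predict/update cycle. It is illustrative only (the noise variances are assumed toy values, and this is not dysii's API):

```python
def kalman_step(x, p, z, q=0.01, r=0.1):
    """One predict/update cycle for a constant-state model.

    x, p : prior state estimate and its variance
    z    : new noisy measurement
    q, r : process and measurement noise variances (assumed toy values)
    """
    # Predict: state unchanged, uncertainty grows by the process noise.
    x_pred, p_pred = x, p + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

# Track a constant true value of 1.0 from noisy measurements.
x, p = 0.0, 1.0
for z in [1.1, 0.9, 1.05, 0.95, 1.0]:
    x, p = kalman_step(x, p, z)
```

The unscented Kalman and particle filters generalize the same predict/update loop to nonlinear and non-Gaussian systems, which is where a distributed implementation pays off.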

About: This is C++ software designed to train large-scale SVMs for binary classification. The algorithm is also implemented in parallel (**PGPDT**) for distributed-memory, strictly coupled multiprocessor [...] Changes: Initial Announcement on mloss.org.
