About: Armadillo is a templated C++ linear algebra library that aims for a good balance between speed and ease of use, with a function syntax similar to MATLAB. Matrix decompositions are provided through optional integration with LAPACK or one of its high-performance drop-in replacements (e.g. Intel MKL, OpenBLAS). Changes:

About: Somoclu is a massively parallel implementation of self-organizing maps. It relies on OpenMP for multicore execution and MPI for distributing the workload, and it can be accelerated by CUDA on a GPU cluster. A sparse kernel is also included, which is useful for training maps on vector spaces generated in text-mining processes. In addition to a command-line interface, Python, R, and MATLAB interfaces are supported. Changes:
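The core update that a parallel SOM trainer distributes across cores can be sketched in a few lines. This is a pure-Python illustration of the online SOM rule only, not Somoclu's API; all names and parameters here are hypothetical:

```python
import math
import random

def train_som(data, rows=4, cols=4, dim=2, epochs=20, seed=0):
    """Online SOM training: find the best-matching unit (BMU) for each
    input, then pull neighboring codebook vectors toward that input."""
    rng = random.Random(seed)
    codebook = [[rng.random() for _ in range(dim)] for _ in range(rows * cols)]
    for epoch in range(epochs):
        # Learning rate and neighborhood radius decay over time.
        lr = 0.5 * (1 - epoch / epochs)
        radius = max(1.0, (max(rows, cols) / 2) * (1 - epoch / epochs))
        for x in data:
            # BMU: the codebook vector closest to the input (Euclidean).
            bmu = min(range(rows * cols),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(codebook[i], x)))
            br, bc = divmod(bmu, cols)
            for i in range(rows * cols):
                r, c = divmod(i, cols)
                d2 = (r - br) ** 2 + (c - bc) ** 2
                h = math.exp(-d2 / (2 * radius ** 2))  # Gaussian neighborhood
                codebook[i] = [w + lr * h * (a - w)
                               for w, a in zip(codebook[i], x)]
    return codebook
```

Somoclu's contribution is parallelizing exactly this inner loop (the BMU search and the neighborhood update) with OpenMP, MPI, or CUDA, and offering a sparse kernel for high-dimensional sparse inputs.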

About: DiffSharp is an automatic differentiation (AD) library providing gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian-vector and Jacobian-vector products. It allows exact and efficient calculation of derivatives, with support for nesting. Changes: Version 0.7.0 is a reimplementation of the library with support for linear algebra primitives, BLAS/LAPACK, 32- and 64-bit precision, and different CPU/GPU backends.
Changed: Namespaces have been reorganized and simplified. This is a breaking change. There is now just one AD implementation, under DiffSharp.AD (with DiffSharp.AD.Float32 and DiffSharp.AD.Float64 variants, see below), which internally makes use of forward or reverse AD as needed.
Added: Support for 32-bit (single-precision) and 64-bit (double-precision) floating-point operations. All modules have Float32 and Float64 versions providing the same functionality with the specified precision. 32-bit floating-point operations are significantly faster (as much as twice as fast) on many current systems.
Added: DiffSharp now uses the OpenBLAS library by default for linear algebra operations. The AD operations with the types D for scalars, DV for vectors, and DM for matrices use the underlying linear algebra backend for highly optimized native BLAS and LAPACK operations. For non-BLAS operations (such as Hadamard products and matrix transposition), parallel implementations in managed code are used. All operations with the D, DV, and DM types support forward and reverse nested AD up to any level. This also paves the way for GPU backends (CUDA/cuBLAS), which will be introduced in upcoming releases. Please see the documentation and API reference for information about how to use the D, DV, and DM types.
Deprecated: The FsAlg generic linear algebra library and the Vector<'T> and Matrix<'T> types are no longer used.
Fixed: Reverse-mode AD has been reimplemented in a tail-recursive way, improving performance and preventing the StackOverflow exceptions encountered in previous versions.
Changed: The library now uses F# 4.0 (FSharp.Core 4.4.0.0).
Changed: The library is now 64-bit only, meaning that users should set "x64" as the platform target for all build configurations.
Fixed: Various other bugs.
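Forward-mode AD, one half of what a library like DiffSharp implements, can be illustrated with dual numbers. This is a minimal pure-Python sketch of the technique only; DiffSharp itself is written in F# and additionally provides reverse mode, vector/matrix types, and nesting:

```python
class Dual:
    """Dual number a + b*eps with eps**2 = 0. Arithmetic on the eps
    coefficient carries the derivative alongside the value, which is
    exactly the forward-mode AD rule for each primitive operation."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (a1 + b1*eps)(a2 + b2*eps) = a1*a2 + (a1*b2 + b1*a2)*eps
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def diff(f, x):
    """Exact derivative of f at x: seed the tangent with 1, read it back."""
    return f(Dual(x, 1.0)).b
```

For example, `diff(lambda x: x * x * x + 2 * x, 3.0)` evaluates to 29.0, i.e. 3x² + 2 at x = 3, exactly rather than by finite differences.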

About: This code is provided by Jun Wan. It was used in the ChaLearn one-shot-learning gesture challenge (round 2). The code includes: bag of features, 3D MoSIFT-based features (i.e. 3D MoSIFT, 3D EMoSIFT, and 3D SMoSIFT), and the MFSK feature. Changes: Initial Announcement on mloss.org.

About: MLweb is an open source project that aims to bring machine learning capabilities into web pages and web applications while keeping all computations on the client side. It includes (i) a JavaScript library enabling scientific computing within web pages, (ii) a JavaScript library implementing machine learning algorithms for classification, regression, clustering, and dimensionality reduction, and (iii) a web application providing a MATLAB-like development environment. Changes:

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...] Changes: In core Weka:
In packages:

About: A platform-independent C++ framework for machine learning, graphical models, and computer vision research and development. Changes: Version 1.9:

About: The Java package jLDADMM is released to provide alternative choices for topic modeling on normal or short texts. It provides implementations of the Latent Dirichlet Allocation topic model and the one-topic-per-document Dirichlet Multinomial Mixture model (i.e. mixture of unigrams), using collapsed Gibbs sampling. In addition, jLDADMM supplies a document clustering evaluation to compare topic models. Changes: Initial Announcement on mloss.org.
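The collapsed Gibbs sampler for the one-topic-per-document DMM model can be sketched as follows. This is an illustrative pure-Python version of the standard collapsed conditional, not jLDADMM's Java code; the function and parameter names are hypothetical:

```python
import random
from collections import Counter

def dmm_gibbs(docs, K=2, alpha=0.1, beta=0.1, iters=200, seed=0):
    """Collapsed Gibbs sampling for the Dirichlet Multinomial Mixture:
    each document gets a single topic; topic-word multinomials are
    integrated out, so only the document-topic labels are resampled."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    V = len(vocab)
    z = [rng.randrange(K) for _ in docs]       # topic label per document
    m = Counter(z)                             # documents per topic
    n = [Counter() for _ in range(K)]          # word counts per topic
    tot = [0] * K                              # total words per topic
    for d, k in zip(docs, z):
        n[k].update(d); tot[k] += len(d)
    for _ in range(iters):
        for i, d in enumerate(docs):
            k = z[i]
            m[k] -= 1; n[k].subtract(d); tot[k] -= len(d)  # remove doc i
            weights = []
            for t in range(K):
                # P(z_i = t | rest) up to a constant: topic popularity
                # times the sequential-draw likelihood of the document.
                p = m[t] + alpha
                j = 0
                for w, c in Counter(d).items():
                    for s in range(c):
                        p *= (n[t][w] + beta + s) / (tot[t] + V * beta + j)
                        j += 1
                weights.append(p)
            k = rng.choices(range(K), weights)[0]          # resample topic
            z[i] = k; m[k] += 1; n[k].update(d); tot[k] += len(d)
    return z
```

jLDADMM implements this sampler (and the per-word LDA analogue) efficiently in Java, together with the clustering evaluation utilities.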

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real-world problems. Changes: This release adds new clustering tools and upgrades the shape_predictor to allow training on datasets with missing landmarks. It also includes bug fixes and minor usability improvements.

About: OpenNN is an open source class library, written in the C++ programming language, which implements neural networks, a main area of deep learning research. The library has been designed to learn from both data sets and mathematical models. Changes: New algorithms and bug fixes.

About: libDAI provides free and open source implementations of various (approximate) inference methods for graphical models with discrete variables, including Bayesian networks and Markov random fields. Changes: Release 0.3.2 fixes various bugs and adds GLC (Generalized Loop Corrections), contributed by Siamak Ravanbakhsh.
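As a baseline for what the approximate methods in a library like libDAI estimate, the exact marginal of a tiny discrete factor graph can be computed by brute-force enumeration. This is a pure-Python illustration only, not libDAI's C++ API; enumeration is exponential in the number of variables, which is precisely why approximate inference is needed:

```python
from itertools import product

def marginal(factors, variables, query):
    """Exact marginal of a binary-variable factor graph by enumerating
    all joint states. factors: list of (scope, table) pairs, where
    table maps a tuple of variable states to a nonnegative potential.
    Works for MRFs and (with CPTs as factors) Bayesian networks."""
    dist = {}
    for assign in product(*([0, 1] for _ in variables)):
        a = dict(zip(variables, assign))
        p = 1.0
        for scope, table in factors:
            p *= table[tuple(a[v] for v in scope)]  # multiply potentials
        dist[a[query]] = dist.get(a[query], 0.0) + p
    Z = sum(dist.values())                          # partition function
    return {k: v / Z for k, v in dist.items()}
```

For example, with a prior factor P(A) = (0.6, 0.4) and a conditional factor P(B|A) with P(B=1|A=0) = 0.1 and P(B=1|A=1) = 0.8, the marginal P(B=1) comes out to 0.6·0.1 + 0.4·0.8 = 0.38; libDAI's algorithms (BP, GLC, and others) approximate such marginals on models far too large to enumerate.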

About: An incremental (online) nonparametric classifier. It can classify both points (standard) and matrices (multivariate time series). Java and MATLAB code is already available. Changes: Version 2: parameterless operation, constant model size, and prediction confidence (for active learning). A C++ version is now available at https://github.com/ilariagori/ABACOC
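The basic incremental loop underlying such online nonparametric classifiers can be sketched as a nearest-neighbor memory. This is a pure-Python illustration of the general idea only; the actual ABACOC algorithm additionally keeps the model size constant and produces prediction confidences, which this sketch does not:

```python
def online_nn():
    """Minimal online nearest-neighbor classifier: predict with the
    label of the closest stored example, then (if a label is given)
    store the new example for future predictions."""
    memory = []  # list of (point, label) pairs

    def step(x, y=None):
        pred = None
        if memory:
            # Nearest stored point by squared Euclidean distance.
            pred = min(memory,
                       key=lambda p: sum((a - b) ** 2
                                         for a, b in zip(p[0], x)))[1]
        if y is not None:
            memory.append((x, y))  # incremental update
        return pred

    return step
```

Calling `step(x, y)` interleaves prediction and learning, which is the online protocol; a bounded-memory variant would instead merge or discard stored points to keep the model size constant.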

About: The PCVM library is a C++/Armadillo implementation of the Probabilistic Classification Vector Machine. Changes: 27.05.2015: MATLAB binding under Windows available. Added a solution file for Visual Studio 2013 Express to compile a MATLAB MEX binding. It is not yet confirmed that the code really uses multiple cores under Windows (under Linux it does).

About: Jie Gui et al., "How to estimate the regularization parameter for spectral regression discriminant analysis and its kernel version?", IEEE Transactions on Circuits and Systems for Video Technology, vol. 24, no. 2, pp. 211-223, 2014. Changes: Initial Announcement on mloss.org.

About: Jie Gui, Zhenan Sun, Guangqi Hou, Tieniu Tan, "An optimal set of code words and correntropy for rotated least squares regression", International Joint Conference on Biometrics, 2014, pp. 1-6. Changes: Initial Announcement on mloss.org.

About: xgboost (eXtreme Gradient Boosting) is an efficient and scalable implementation of the gradient boosting framework. The package includes an efficient linear model solver and a tree learning algorithm. It can automatically do parallel computation with OpenMP, and it can be more than 10 times faster than existing gradient boosting packages such as gbm or sklearn.GBM. It supports various objective functions, including regression, classification, and ranking. The package is made to be extensible, so that users can also easily define their own objectives. The newest version of xgboost supports distributed learning on various platforms such as Hadoop and MPI, and scales to even larger problems. Changes:
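The gradient boosting idea that xgboost implements at scale can be sketched with regression stumps and squared error. This is a pure-Python illustration of the technique, not xgboost's API; for squared error the negative gradient is simply the residual, so each round fits a one-split tree to the residuals and adds it with a shrinkage factor:

```python
def fit_gbm(xs, ys, n_rounds=50, lr=0.1):
    """Gradient boosting with regression stumps on squared error."""
    base = sum(ys) / len(ys)                  # initial constant model
    pred = [base] * len(xs)
    stumps = []                               # (threshold, left, right)
    for _ in range(n_rounds):
        resid = [y - p for y, p in zip(ys, pred)]  # negative gradient
        best = None
        for t in sorted(set(xs)):             # candidate split thresholds
            left = [r for x, r in zip(xs, resid) if x <= t]
            right = [r for x, r in zip(xs, resid) if x > t]
            if not left or not right:
                continue
            lmean = sum(left) / len(left)
            rmean = sum(right) / len(right)
            sse = (sum((r - lmean) ** 2 for r in left)
                   + sum((r - rmean) ** 2 for r in right))
            if best is None or sse < best[0]:
                best = (sse, t, lmean, rmean)
        _, t, lmean, rmean = best
        stumps.append((t, lr * lmean, lr * rmean))   # shrink the step
        pred = [p + (lr * lmean if x <= t else lr * rmean)
                for x, p in zip(xs, pred)]

    def predict(x):
        return base + sum(l if x <= t else r for t, l, r in stumps)
    return predict
```

xgboost generalizes this loop to arbitrary differentiable objectives (using first- and second-order gradients), deeper regularized trees, and parallel/distributed split finding.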

About: This MATLAB package provides the LOMO feature extraction and XQDA metric learning algorithms proposed in our CVPR 2015 paper. It is fast and effective for person re-identification. For more details, please visit http://www.cbsr.ia.ac.cn/users/scliao/projects/lomo_xqda/. Changes: Initial Announcement on mloss.org.

About: Bayesian Logic (BLOG) is a probabilistic modeling language. It is designed for representing relations and uncertainties among real-world objects. Changes: Initial Announcement on mloss.org.

About: FsAlg is a linear algebra library that supports generic types. Changes:Initial Announcement on mloss.org.
