About: The package provides a Lagrangian approach to the posterior regularization of given linear mappings. This is important in two cases: (a) when systems are underdetermined, and (b) when the external model for calculating the mapping is invariant to properties such as scaling. The software may be applied when the external model does not provide its own regularization strategy. In addition, the package allows ranking attributes according to their distortion potential with respect to a given linear mapping. Changes: Version 1.1 (May 23, 2012): memory and time optimizations; distderivrel.m now supports assessing the relevance of attribute pairs. Version 1.0 (Nov 9, 2011): Initial Announcement on mloss.org.

About: Fast C++ implementation of the variation of information (Meila 2003) and the Rand index (Rand 1971), with MATLAB mex files. Changes: Initial Announcement on mloss.org.
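To illustrate the quantity this package computes in fast C++, here is a minimal pure-Python sketch of the Rand index: the fraction of point pairs on which two clusterings agree. The function name is illustrative only and is not the package's API.

```python
# Hedged sketch of the Rand index (Rand 1971), not the package's C++/mex code.
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Fraction of point pairs grouped consistently by both clusterings."""
    assert len(labels_a) == len(labels_b)
    pairs = list(combinations(range(len(labels_a)), 2))
    agree = 0
    for i, j in pairs:
        same_a = labels_a[i] == labels_a[j]  # pair together in clustering A?
        same_b = labels_b[i] == labels_b[j]  # pair together in clustering B?
        if same_a == same_b:
            agree += 1
    return agree / len(pairs)

# Identical clusterings (up to relabeling) score 1.0.
score = rand_index([0, 0, 1, 1], [1, 1, 0, 0])
```

Because the index only compares pair co-membership, it is invariant to permuting cluster labels, which is why the relabeled example above still scores perfectly.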

About: This archive contains a Matlab implementation of the Multilinear Principal Component Analysis (MPCA) algorithm and MPCA+LDA, as described in the paper: Haiping Lu, K.N. Plataniotis, and A.N. Venetsanopoulos, "MPCA: Multilinear Principal Component Analysis of Tensor Objects", IEEE Transactions on Neural Networks, Vol. 19, No. 1, pp. 18-39, January 2008. Changes: Initial Announcement on mloss.org.

About: Message passing for topic modeling Changes:

About: A Matlab script for learning vector-valued functions and kernels on the output space. Changes: Added code for learning low-rank output kernels.

About: Gaussian process RTS smoothing (forward-backward smoothing) based on moment matching. Changes: Initial Announcement on mloss.org.

About: This local and parallel computation toolbox is the Octave and Matlab implementation of several localized Gaussian process regression methods: the domain decomposition method (Park et al., 2011, DDM), partial independent conditional (Snelson and Ghahramani, 2007, PIC), localized probabilistic regression (Urtasun and Darrell, 2008, LPR), and bagging for Gaussian process regression (Chen and Ren, 2009, BGP). Most of the localized regression methods can be applied to general machine learning problems, although DDM is only applicable to spatial datasets. In addition, GPLP provides two parallel computation versions of the domain decomposition method. Ease of parallelization is one of the advantages of localized regression, and the two parallel implementations illustrate how to realize this advantage in software. Changes: Initial Announcement on mloss.org.

About: This package is a set of Matlab scripts that implements the algorithms described in the submitted paper "Lp-Lq Sparse Linear and Sparse Multiple Kernel Multi-Task Learning". Changes: Initial Announcement on mloss.org.

About: Implementation of the multi-assignment clustering method for Boolean vectors. Changes: Added a new bibliography entry.

About: Matlab SVM toolbox for learning large-margin filters on signals or images. Changes: Initial Announcement on mloss.org.

About: Locally Weighted Projection Regression (LWPR) is a recent algorithm that achieves nonlinear function approximation in high dimensional spaces with redundant and irrelevant input dimensions. At its [...] Changes:Version 1.2.4

About: MATLAB toolbox for advanced Brain-Computer Interface (BCI) research. Changes: Initial Announcement on mloss.org.

About: This is a Matlab/C++ "toolbox" of code for learning and inference with graphical models. It is focused on parameter learning using marginalization in the high-treewidth setting. Changes: Initial Announcement on mloss.org.

About: Multicore/distributed large-scale machine learning framework. Changes: Updated version.

About: FLANN is a library for performing fast approximate nearest neighbor searches in high dimensional spaces. It contains a collection of algorithms we found to work best for nearest neighbor search. Changes:See project page for changes.
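For context on what FLANN accelerates, here is a minimal sketch of the exact brute-force nearest-neighbor search that its approximate algorithms (randomized kd-trees, hierarchical k-means trees) are designed to outperform in high dimensions. This is an illustrative baseline only, not FLANN's API.

```python
# Hedged sketch: O(n) exact nearest-neighbor search, the baseline
# that approximate methods like FLANN's trade accuracy to beat.
import math

def nearest_neighbor(query, points):
    """Return (index, distance) of the point closest to `query`."""
    best_i, best_d = -1, math.inf
    for i, p in enumerate(points):
        d = math.dist(query, p)  # Euclidean distance (Python 3.8+)
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d

idx, dist = nearest_neighbor((0.1, 0.2), [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0)])
```

The linear scan is exact but scales poorly with dataset size; approximate indices accept a small chance of returning a near-optimal neighbor in exchange for orders-of-magnitude speedups.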

About: Denoising images via normalized convolution Changes:Initial Announcement on mloss.org.

About: Multiclass vector classification based on cost-function-driven learning vector quantization, minimizing misclassification. Changes: Initial Announcement on mloss.org.
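As background, here is a minimal sketch of the classic LVQ1 update rule, the heuristic precursor of the cost-function-driven LVQ this package implements (the package minimizes misclassification directly; this sketch is not its algorithm or API).

```python
# Hedged sketch of LVQ1: move the winning prototype toward a correctly
# classified sample and away from a misclassified one.
def lvq1_step(prototypes, labels, x, y, lr=0.1):
    """One online update; mutates `prototypes`, returns the winner index."""
    # Winner = prototype nearest to x under squared Euclidean distance.
    win = min(range(len(prototypes)),
              key=lambda k: sum((p - xi) ** 2
                                for p, xi in zip(prototypes[k], x)))
    sign = 1.0 if labels[win] == y else -1.0  # attract if correct, repel if not
    prototypes[win] = tuple(p + sign * lr * (xi - p)
                            for p, xi in zip(prototypes[win], x))
    return win

protos = [(0.0, 0.0), (1.0, 1.0)]
proto_labels = [0, 1]
winner = lvq1_step(protos, proto_labels, (0.2, 0.0), 0)
```

Cost-function-driven variants (e.g. GLVQ-style) replace this heuristic attract/repel step with the gradient of an explicit misclassification cost, which is what the package's description refers to.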

About: Bayesian Reasoning and Machine Learning toolbox Changes:Fixed some small bugs and updated some demos.

About: Correlative Matrix Mapping (CMM) provides a supervised linear data mapping into a Euclidean subspace of given dimension. Applications include denoising, visualization, label-specific data preprocessing, and assessment of data attribute pairs relevant to the supervised mapping. Solving auto-association problems yields linear multidimensional scaling, similar to PCA, but usually with more faithful low-dimensional mappings. Changes: Tue Jul 5 14:40:03 CEST 2011: bug fixes and cleanups.

About: A fast and scalable graph-based clustering algorithm based on the eigenvectors of the nonlinear 1-Laplacian. Changes:
