About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems. Changes:This release adds a bunch of new image processing routines as well as many minor usability improvements and bug fixes.
|
About: The GPML toolbox is a flexible and generic Octave/Matlab implementation of inference and prediction with Gaussian process models. The toolbox offers exact inference, approximate inference for non-Gaussian likelihoods (Laplace's Method, Expectation Propagation, Variational Bayes), as well as approximations for large datasets (FITC, VFE, KISS-GP). A wide range of covariance, likelihood, mean and hyperprior functions makes it possible to build very complex GP models (a standalone sketch of the exact-inference case appears after this entry). Changes:Logdet-estimation functionality for grid-based approximate covariances
More generic infEP functionality
New infKL function contributed by Emtiyaz Khan and Wu Lin
Time-series covariance functions on the positive real line
New covariance functions
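To make the exact-inference case above concrete, here is a minimal standalone NumPy sketch of GP regression with a squared-exponential covariance and a Gaussian likelihood. It does not use GPML's Octave/Matlab interface, and the kernel and noise hyperparameters are arbitrary illustrative values.

```python
import numpy as np

def sq_exp_kernel(a, b, ell=1.0, sf=1.0):
    """Squared-exponential covariance k(x, x') = sf^2 exp(-(x - x')^2 / (2 ell^2))."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

def gp_predict(x, y, xs, ell=1.0, sf=1.0, sn=0.1):
    """Exact GP posterior mean and variance of the latent function at test inputs xs."""
    K = sq_exp_kernel(x, x, ell, sf) + sn**2 * np.eye(len(x))
    Ks = sq_exp_kernel(x, xs, ell, sf)
    Kss = sq_exp_kernel(xs, xs, ell, sf)
    L = np.linalg.cholesky(K)                            # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha                                     # posterior mean
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v**2, axis=0)             # posterior variance
    return mu, var

# toy 1-D regression data (illustrative)
x = np.linspace(-3, 3, 20)
y = np.sin(x) + 0.1 * np.random.randn(20)
mu, var = gp_predict(x, y, np.linspace(-3, 3, 50))
```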
|
About: A Java Toolbox for Scalable Probabilistic Machine Learning. Changes:
Detailed information can be found on the toolbox's web page.
|
About: An extensible C++ library of hierarchical Bayesian clustering algorithms, such as Bayesian Gaussian mixture models, variational Dirichlet processes, Gaussian latent Dirichlet allocation and more. Changes:New maximum-cluster argument for all algorithms. The Matlab interface has been removed, since it seemed no one was using it and I can no longer support it.
|
About: The Libra Toolkit is a collection of algorithms for learning and inference with discrete probabilistic models, including Bayesian networks, Markov networks, dependency networks, sum-product networks, arithmetic circuits, and mixtures of trees. Changes:Version 1.1.2d (12/29/2015):
|
About: libDAI provides free & open source implementations of various (approximate) inference methods for graphical models with discrete variables, including Bayesian networks and Markov Random Fields. Changes:Release 0.3.2 fixes various bugs and adds GLC (Generalized Loop Corrections) written by Siamak Ravanbakhsh.
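As a point of reference for what such inference methods compute, the following standalone NumPy sketch (not libDAI's C++ API) evaluates a single-variable marginal of a tiny binary chain MRF by brute-force enumeration; libDAI's exact and approximate algorithms target models for which this enumeration is infeasible. The potentials are arbitrary illustrative values.

```python
import numpy as np
from itertools import product

# Pairwise potentials of a binary chain MRF x1 - x2 - x3 (values chosen arbitrarily).
psi12 = np.array([[2.0, 1.0], [1.0, 2.0]])
psi23 = np.array([[1.0, 3.0], [3.0, 1.0]])

def joint(x1, x2, x3):
    """Unnormalised joint p(x1, x2, x3) proportional to psi12[x1, x2] * psi23[x2, x3]."""
    return psi12[x1, x2] * psi23[x2, x3]

# Exact marginal p(x2) by summing over all 2^3 configurations.
marg = np.zeros(2)
for x1, x2, x3 in product([0, 1], repeat=3):
    marg[x2] += joint(x1, x2, x3)
marg /= marg.sum()
print("p(x2):", marg)   # sum-product on this tree-structured model gives the same answer
```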
|
About: Gaussian processes with general nonlinear likelihoods using the unscented transform or Taylor series linearisation. Changes:Initial Announcement on mloss.org.
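For background, the sketch below (standalone NumPy, not the package's interface) shows the scalar unscented transform used to propagate a Gaussian through a nonlinearity: three sigma points are mapped through the function and reweighted to approximate the output mean and variance. The spread parameter lam is a common heuristic choice, not a value taken from the package.

```python
import numpy as np

def unscented_transform_1d(mu, var, f, lam=2.0):
    """Approximate mean/variance of f(x) for x ~ N(mu, var) from 3 sigma points (n = 1)."""
    n = 1
    spread = np.sqrt((n + lam) * var)
    pts = np.array([mu, mu + spread, mu - spread])                 # sigma points
    w = np.array([lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)])
    fx = f(pts)
    mean = w @ fx
    variance = w @ (fx - mean) ** 2
    return mean, variance

# Example: push a Gaussian through a squashing nonlinearity
mean, variance = unscented_transform_1d(0.5, 0.2, np.tanh)
print(mean, variance)
```

By contrast, first-order Taylor linearisation would approximate the output moments as f(mu) and f'(mu)^2 * var.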
|
About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference. Changes:Added factorial mean field inference as a third algorithm, complementing expectation propagation and variational Bayes. Generalised non-Gaussian potentials so that affine rather than linear functions of the latent variables can be used.
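To illustrate the MAP-estimation side of such a framework, here is a generic NumPy sketch of MAP estimation for logistic regression (a GLM) with a Gaussian prior, fitted by Newton's method (IRLS). It does not use glm-ie's Matlab interface; the prior precision, iteration count and toy data are illustrative assumptions.

```python
import numpy as np

def logistic_map(X, y, prior_prec=1.0, iters=20):
    """MAP weights for logistic regression with a Gaussian prior N(0, prior_prec^-1 I),
    found by Newton's method (IRLS)."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))                 # Bernoulli mean via the logistic link
        grad = X.T @ (y - p) - prior_prec * w            # gradient of the log posterior
        H = X.T @ (X * (p * (1 - p))[:, None]) + prior_prec * np.eye(X.shape[1])
        w += np.linalg.solve(H, grad)                    # Newton step
    return w

# toy binary classification data (illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X @ np.array([1.5, -2.0]) + 0.3 * rng.normal(size=100) > 0).astype(float)
print(logistic_map(X, y))
```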
|
About: Data-efficient policy search framework using probabilistic Gaussian process models. Changes:Initial Announcement on mloss.org.
|
About: Gaussian process RTS smoothing (forward-backward smoothing) based on moment matching. Changes:Initial Announcement on mloss.org.
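As a reminder of the forward-backward structure involved, the following standalone NumPy sketch implements a scalar linear-Gaussian Kalman filter followed by RTS smoothing. In the package described above, the linear transition and observation models are replaced by learned GP models and the Gaussian moments are propagated by moment matching; all parameters below are illustrative assumptions, not values from the package.

```python
import numpy as np

def kalman_rts(y, A, Q, C, R, mu0, P0):
    """Forward Kalman filtering then backward RTS smoothing for a scalar linear-Gaussian model."""
    T = len(y)
    mu_f, P_f = np.zeros(T), np.zeros(T)     # filtered moments
    mu_p, P_p = np.zeros(T), np.zeros(T)     # one-step-ahead predicted moments
    m, P = mu0, P0
    for t in range(T):                        # forward pass: predict, then update with y[t]
        m, P = A * m, A * P * A + Q
        mu_p[t], P_p[t] = m, P
        K = P * C / (C * P * C + R)
        m, P = m + K * (y[t] - C * m), (1 - K * C) * P
        mu_f[t], P_f[t] = m, P
    mu_s, P_s = mu_f.copy(), P_f.copy()
    for t in range(T - 2, -1, -1):            # backward pass: RTS corrections
        G = P_f[t] * A / P_p[t + 1]
        mu_s[t] = mu_f[t] + G * (mu_s[t + 1] - mu_p[t + 1])
        P_s[t] = P_f[t] + G * (P_s[t + 1] - P_p[t + 1]) * G
    return mu_s, P_s

# toy random-walk state observed with noise (illustrative)
rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(scale=0.1, size=50))
y = x + rng.normal(scale=0.5, size=50)
mu_s, P_s = kalman_rts(y, A=1.0, Q=0.01, C=1.0, R=0.25, mu0=0.0, P0=1.0)
```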
|
About: Bayesian Reasoning and Machine Learning toolbox. Changes:Fixed some small bugs and updated some demos.
|
About: Matlab implementation of variational Gaussian approximate inference for Bayesian generalized linear models. Changes:Code restructure and bug fix.
|
About: The library focuses on implementations of propagation-based approximate inference methods. Also implemented are clique-tree-based exact inference, Gibbs sampling, and the mean-field algorithm. Changes:Initial Announcement on mloss.org.
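As an example of the last of these, here is a standalone NumPy sketch (not the library's API) of the naive mean-field fixed-point update for a small Ising model; the couplings and fields are arbitrary illustrative values.

```python
import numpy as np

def ising_mean_field(h, J, iters=100):
    """Naive mean field for an Ising model p(x) proportional to exp(sum_i h_i x_i + sum_{i<j} J_ij x_i x_j),
    x_i in {-1, +1}; returns approximate magnetisations m_i = E[x_i]."""
    m = np.zeros_like(h)
    for _ in range(iters):                    # coordinate updates toward a fixed point
        for i in range(len(h)):
            m[i] = np.tanh(h[i] + J[i] @ m)   # J is symmetric with zero diagonal
    return m

# tiny 3-node model with arbitrary fields and couplings
h = np.array([0.2, -0.1, 0.0])
J = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.3],
              [0.0, 0.3, 0.0]])
print(ising_mean_field(h, J))
```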
|
About: stroll (STRuctured Output Learning Library) is a library for Structured Output Learning. Changes:Initial Announcement on mloss.org.
|