Project details for GPML Gaussian Processes for Machine Learning Toolbox

JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.2

by hn - January 21, 2013, 15:34:50 CET


Ratings (based on 2 votes): Overall 5/5, Features 5/5, Usability 5/5, Documentation 5/5
Description:

The GPML toolbox implements approximate inference algorithms for Gaussian processes, such as Expectation Propagation, the Laplace Approximation and Variational Bayes, for a wide class of likelihood functions for both regression and classification. It comes with a large algebra of covariance and mean functions that can be composed for flexible modeling. The code is fully compatible with Octave 3.2.x.
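
As a quick illustration, here is a minimal sketch of GP binary classification with EP inference using the gp() interface of the 3.x releases; the training inputs x, +1/-1 labels y and test inputs xs are placeholders, and the initial hyperparameter values are arbitrary.

  % GP classification with the probit (likErf) likelihood and EP inference.
  meanfunc = @meanConst;  hyp.mean = 0;
  covfunc  = @covSEiso;   hyp.cov  = log([1.0; 1.0]);  % [log(ell); log(sf)]
  likfunc  = @likErf;     hyp.lik  = [];               % probit has no hyperparameters

  % fit hyperparameters by minimizing the negative log marginal likelihood
  hyp = minimize(hyp, @gp, -40, @infEP, meanfunc, covfunc, likfunc, x, y);

  % predictive means/variances and log probabilities of class +1 at the test inputs
  [ymu, ys2, fmu, fs2, lp] = gp(hyp, @infEP, meanfunc, covfunc, likfunc, ...
                                x, y, xs, ones(size(xs,1),1));

Exact regression with a Gaussian likelihood follows the same pattern, swapping in @infExact and @likGauss.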

Changes to previous version:

We now support inference on large datasets via the FITC approximation for non-Gaussian likelihoods, with both EP and Laplace's approximation. New likelihood functions: a mixture likelihood, a Poisson likelihood and label noise. We also added two MCMC samplers.
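
For example, sparse classification combining the FITC approximation with EP might look like the following sketch, assuming the covFITC wrapper and the infFITC_EP inference method of this release; x, y, xs and the choice of inducing inputs are placeholders.

  % Sparse GP classification: FITC approximation combined with EP inference.
  meanfunc = @meanConst;  hyp.mean = 0;
  covfunc  = @covSEiso;   hyp.cov  = log([1.0; 1.0]);
  likfunc  = @likErf;     hyp.lik  = [];

  % pick inducing inputs u, here a random subset of the training inputs
  nu  = min(50, size(x,1));
  idx = randperm(size(x,1));
  u   = x(idx(1:nu), :);
  covfuncF = {@covFITC, {covfunc}, u};   % wrap the covariance for FITC

  hyp = minimize(hyp, @gp, -40, @infFITC_EP, meanfunc, covfuncF, likfunc, x, y);
  [ymu, ys2] = gp(hyp, @infFITC_EP, meanfunc, covfuncF, likfunc, x, y, xs);

Replacing @infFITC_EP with @infFITC_Laplace should give the corresponding Laplace variant mentioned above.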

BibTeX Entry: Download
Supported Operating Systems: Agnostic, Platform Independent
Data Formats: Matlab, Octave
Tags: Classification, Regression, Approximate Inference, Gaussian Processes
Archive: download here
