Project details for GPML Gaussian Processes for Machine Learning Toolbox 3.4

by hn - November 11, 2013, 14:46:52 CET


Ratings (based on 2 votes): Overall 5/5, Features 5/5, Usability 5/5, Documentation 5/5
Description:

The GPML toolbox implements approximate inference algorithms for Gaussian processes, such as Expectation Propagation, the Laplace approximation and Variational Bayes, for a wide class of likelihood functions covering both regression and classification. It comes with a rich algebra of covariance and mean functions that allows for flexible modeling. The code is fully compatible with Octave 3.2.x.
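For illustration, a minimal regression sketch using the toolbox's gp and minimize routines is given below; the data x, y and test inputs xs are made up, and it assumes GPML 3.4 is on the Matlab/Octave path (e.g. by running the startup script shipped with the toolbox):

  x  = linspace(-3, 3, 20)';             % training inputs (illustrative)
  y  = sin(x) + 0.1*randn(size(x));      % noisy training targets (illustrative)
  xs = linspace(-4, 4, 101)';            % test inputs

  meanfunc = @meanZero;                  % zero mean function
  covfunc  = @covSEiso;                  % squared exponential covariance
  likfunc  = @likGauss;                  % Gaussian likelihood (regression)

  hyp.mean = [];
  hyp.cov  = log([1; 1]);                % [log length-scale; log signal std]
  hyp.lik  = log(0.1);                   % log noise std

  % optimise hyperparameters by minimising the negative log marginal likelihood
  hyp = minimize(hyp, @gp, -100, @infExact, meanfunc, covfunc, likfunc, x, y);

  % predictive mean and variance at the test inputs
  [mu, s2] = gp(hyp, @infExact, meanfunc, covfunc, likfunc, x, y, xs);

Swapping likGauss/infExact for a classification likelihood such as likLogistic together with infEP or infLaplace turns the same call sequence into approximate inference for classification.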

Changes to previous version:
  • derivatives w.r.t. the inducing points xu in infFITC, infFITC_Laplace and infFITC_EP, so that the inducing points can be treated either as fixed given quantities or as additional hyperparameters (see the sketch after this list)
  • new GLM likelihood likExp for inter-arrival time modeling
  • new GLM likelihood likWeibull for extremal value regression
  • new GLM likelihood likGumbel for extremal value regression
  • new mean function meanPoly depending polynomially on the data
  • infExact can deal safely with the zero noise variance limit
  • support of GP warping through the new likelihood function likGaussWarp
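
As a rough sketch of the sparse (FITC) setup referenced in the first item, the base covariance is wrapped with covFITC and inference is done with infFITC; the variable names and data below are made up, and the inducing inputs xu are kept fixed here, whereas this release additionally provides the derivatives needed to optimise them as hyperparameters:

  x  = linspace(-3, 3, 200)';                % training inputs (illustrative)
  y  = sin(x) + 0.1*randn(size(x));          % noisy training targets (illustrative)
  xu = linspace(-3, 3, 10)';                 % inducing inputs (kept fixed here)

  covfunc  = {@covFITC, {@covSEiso}, xu};    % wrap base covariance for FITC
  meanfunc = @meanZero;
  likfunc  = @likGauss;

  hyp.mean = []; hyp.cov = log([1; 1]); hyp.lik = log(0.1);

  % sparse approximate inference with infFITC; infFITC_Laplace and infFITC_EP
  % follow the same pattern for non-Gaussian likelihoods
  hyp = minimize(hyp, @gp, -100, @infFITC, meanfunc, covfunc, likfunc, x, y);
  [mu, s2] = gp(hyp, @infFITC, meanfunc, covfunc, likfunc, x, y, x);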
Supported Operating Systems: Agnostic, Platform Independent
Data Formats: Matlab, Octave
Tags: Classification, Regression, Approximate Inference, Gaussian Processes
