Project details for the GPML Gaussian Processes for Machine Learning Toolbox 4.0


by hn - October 19, 2016, 10:15:05 CET

Description:

The GPML toolbox implements approximate inference algorithms for Gaussian processes, such as Expectation Propagation, the Laplace Approximation and Variational Bayes, for a wide class of likelihood functions for both regression and classification. It comes with a large algebra of covariance, likelihood, mean and hyperprior functions, allowing for flexible modeling. The code is fully compatible with Octave 3.2.x.
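
For orientation, the toolbox's standard calling convention as a minimal sketch (x, y and the test inputs xs are placeholder data; the hyperparameter initialisations are arbitrary):

    % GP regression with a squared-exponential covariance and Gaussian likelihood
    meanfunc = {@meanZero}; covfunc = {@covSEiso}; likfunc = {@likGauss};
    hyp.mean = []; hyp.cov = [0; 0]; hyp.lik = log(0.1);   % log lengthscale, signal, noise
    hyp = minimize(hyp, @gp, -100, @infGaussLik, meanfunc, covfunc, likfunc, x, y);  % train
    [mu, s2] = gp(hyp, @infGaussLik, meanfunc, covfunc, likfunc, x, y, xs);          % predict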

Changes to previous version:

A major code restructuring took place in this release, unifying several inference functions and allowing more flexibility in covariance function composition. We also redesigned the entire derivative computation pipeline, which substantially improves overall runtime, and grid-based covariance approximations are now included natively.

More generic sparse approximation using Power EP

  • unified treatment of the FITC approximation, the variational free energy (VFE) approach, and hybrids between the two

  • inducing input optimisation for all (compositions of) covariance functions, dropping the previous limitation to a few standard examples

  • infFITC is now covered by the more generic infGaussLik function, as sketched below
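
A minimal sketch of the unified sparse setup, assuming the opt.s field from the 4.0 demos selects the Power EP fraction (s = 1 recovers FITC, s = 0 the VFE bound, intermediate values give hybrids):

    % sparse approximation with inducing inputs xu; any covariance composition works
    covfunc = {@apxSparse, {@covSEiso}, xu};   % wrap the base covariance
    infs = @(varargin) infGaussLik(varargin{:}, struct('s', 0.5));  % hybrid Power EP
    hyp.xu = xu;                               % optional: also optimise the inducing inputs
    hyp = minimize(hyp, @gp, -100, infs, {@meanZero}, covfunc, {@likGauss}, x, y);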

Approximate covariance object unifying sparse approximations, grid-based approximations and exact covariance computations

  • implementation in cov/apx, cov/apxGrid, cov/apxSparse

  • generic infGaussLik unifies infExact, infFITC and infGrid

  • generic infLaplace unifies infLaplace, infFITC_Laplace and infGrid_Laplace
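
A corresponding grid-based sketch, assuming the apxGrid conventions used in the toolbox demos (xg is a cell array holding one grid per input dimension; opt controls the iterative linear solver):

    % Kronecker-structured covariance over a 2d grid xg = {g1, g2}
    covfunc = {@apxGrid, {{@covSEiso}, {@covSEiso}}, xg};
    opt = struct('cg_maxit', 200, 'cg_tol', 1e-5);   % conjugate gradient controls
    infg = @(varargin) infGaussLik(varargin{:}, opt);
    [mu, s2] = gp(hyp, infg, {@meanZero}, covfunc, {@likGauss}, x, y, xs);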

Hierarchical structure of covariance functions

  • clear hierarchical compositional implementation

  • no more code duplication as previously present in pairs such as covSEiso and covSEard

  • two mother covariance functions

    • covDot for dot-product-based covariances and

    • covMaha for Mahalanobis-distance-based covariances

  • a variety of modifiers: eye, iso, ard, proj, fact, vlen

  • more flexibility, as many more variants become available (sketched below)

  • all covariance functions offer derivatives w.r.t. inputs
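
As an illustration of the compositional scheme, a sketch assuming the mode-based calling convention of the 4.0 covariance functions (the mode names mirror the modifier list above):

    % one base covariance, several Mahalanobis-distance modes
    cov_iso = {@covSE, 'iso', []};      % shared lengthscale, replaces covSEiso
    cov_ard = {@covSE, 'ard', []};      % per-dimension lengthscales, replaces covSEard
    K = feval(cov_iso{:}, hyp.cov, x);  % covariance matrix evaluation as before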

Faster derivative computations for mean and cov functions

  • switched from partial derivatives to directional derivatives

  • simpler and more concise interface of mean and cov functions

  • much faster marginal likelihood derivative computations

  • simpler and more compact code
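
The change is visible in the calling convention; a sketch contrasting the two interfaces, assuming the 4.0 convention where the second output is a directional-derivative handle (Q is any direction matrix of the same size as K):

    % up to v3.x: one call per hyperparameter i
    %   dKi = feval(covfunc{:}, hyp.cov, x, [], i);
    % v4.0: a single call returns K and a derivative handle
    [K, dK] = feval(covfunc{:}, hyp.cov, x);
    dhyp = dK(Q);   % derivatives of trace(Q'*K) w.r.t. all hyperparameters at once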

New mean functions

  • new mean/meanWSPC (Weighted Sum of Projected Cosines, also known as Random Kitchen Sink features), following a suggestion by William Herlands

  • new mean/meanWarp for constructing a new mean from an existing one by means of a warping function, adapted from code by William Herlands
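
A hedged sketch of meanWarp (the exact signature is documented in its help text; here @exp serves as both the warping function g and its derivative g'):

    % warp a linear mean through exp: m(x) = exp(meanLinear(x))
    meanfunc = {@meanWarp, {@meanLinear}, @exp, @exp};
    hyp.mean = [1; 0.5];   % weights of the inner linear mean, for 2d inputs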

New optimizer

  • added a new minimize_minfunc, contributed by Truong X. Nghiem
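
Assuming it mirrors the calling convention of the existing minimize (a sketch; minFunc itself has to be obtained separately):

    % drop-in replacement for minimize, backed by minFunc
    hyp = minimize_minfunc(hyp, @gp, -100, @infGaussLik, meanfunc, covfunc, likfunc, x, y);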

New GLM link function

  • added the twice-logistic link function util/glm_invlink_logistic2
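
A hedged sketch of how such an inverse link is typically selected in the GLM-style likelihoods (assuming 'logistic2' is accepted as a link name analogous to the existing 'exp' and 'logistic'):

    % Poisson likelihood for count data with the twice-logistic inverse link
    likfunc = {@likPoisson, 'logistic2'};
    [mu, s2] = gp(hyp, @infLaplace, meanfunc, covfunc, likfunc, x, y, xs);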

Smaller fixes

  • two-fold speedup of util/elsympol (used by covADD), contributed by Truong X. Nghiem

  • bugfix in util/logphi as reported by John Darby

Supported Operating Systems: Agnostic, Platform Independent
Data Formats: Matlab, Octave
Tags: Classification, Regression, Approximate Inference, Gaussian Processes