Project details for GPstuff


by avehtari - June 17, 2013, 13:22:52 CET

Description:

If you use GPstuff, please cite: Jarno Vanhatalo, Jaakko Riihimäki, Jouni Hartikainen, Pasi Jylänki, Ville Tolvanen, and Aki Vehtari (2013). GPstuff: Bayesian Modeling with Gaussian Processes. Journal of Machine Learning Research, 14:1175-1179.

See also the user guide at http://arxiv.org/abs/1206.5754

GPstuff is a toolbox for Bayesian modeling with Gaussian processes, with the following features and more (a minimal usage sketch follows the feature list):

  • Several covariance functions (e.g. squared exponential, exponential, Matérn, periodic and a compactly supported piecewise polynomial function)
    • Sums, products and scaling of covariance functions
    • Euclidean and delta distance
  • Several mean functions with marginalized parameters
  • Several likelihood/observation models
    • Continuous observations: Gaussian, Gaussian scale mixture (MCMC only), Student's t, quantile regression
    • Classification: Logit, Probit, multinomial logit (softmax), multinomial probit
    • Count data: Binomial, Poisson, (Zero truncated) Negative-Binomial, Hurdle model, Zero-inflated Negative-Binomial, Multinomial
    • Survival: Cox-PH, Weibull, log-Gaussian, log-logistic
    • Point process: Log-Gaussian Cox process
    • Density estimation and regression: logistic GP
    • Other: derivative observations (squared exponential covariance function only)
  • Hierarchical priors for hyperparameters
  • Sparse models
    • Sparse matrix routines for compactly supported covariance functions
    • Fully and partially independent conditional (FIC, PIC)
    • Compactly supported plus FIC (CS+FIC)
    • Variational sparse (VAR), Deterministic training conditional (DTC), Subset of regressors (SOR) (Gaussian/EP only)
    • PASS-GP
  • Latent inference
    • Exact (Gaussian only)
    • Laplace, Expectation propagation (EP), Parallel EP, Robust-EP
    • Marginal posterior corrections (cm2 and fact)
    • Scaled Metropolis, Hamiltonian Monte Carlo (HMC), Scaled HMC, Elliptical slice sampling
  • Hyperparameter inference
    • Type II ML/MAP
    • Leave-one-out cross-validation (LOO-CV), Laplace/EP LOO-CV
    • Metropolis, HMC, No-U-Turn-Sampler (NUTS), Slice Sampling (SLS), Surrogate SLS, Shrinking-rank SLS, Covariance-matching SLS
    • Grid, CCD, Importance sampling
  • Model assessment
    • LOO-CV, Laplace/EP LOO-CV, IS-LOO-CV, k-fold-CV
    • WAIC, DIC
    • Average predictive comparison
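
To make the basic workflow concrete, below is a minimal regression sketch in GPstuff's MATLAB/Octave style: a squared exponential covariance function, a Gaussian likelihood, type II MAP optimization of the hyperparameters, and prediction at test inputs. The toy data and the specific option values are placeholders chosen for illustration; see the user guide linked above for the authoritative examples.

    % Toy 1-D regression data (placeholders for illustration)
    x  = linspace(-3, 3, 100)';            % training inputs
    y  = sin(x) + 0.2*randn(size(x));      % noisy training targets
    xt = linspace(-4, 4, 200)';            % test inputs

    % Model: Gaussian likelihood + squared exponential covariance
    lik  = lik_gaussian('sigma2', 0.2^2);
    gpcf = gpcf_sexp('lengthScale', 1, 'magnSigma2', 1);
    gp   = gp_set('lik', lik, 'cf', gpcf);

    % Type II MAP estimate of the hyperparameters
    opt = optimset('TolFun', 1e-3, 'TolX', 1e-3);
    gp  = gp_optim(gp, x, y, 'opt', opt);

    % Posterior predictive mean and variance at the test inputs
    [Eft, Varft] = gp_pred(gp, x, y, xt);

Swapping in a different likelihood (e.g. lik_probit for classification) or adding 'latent_method' in gp_set switches to the approximate latent inference methods listed above.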
Changes from the previous version:

2013-06-14 Version 4.2

Improvements

  • Cross-validation is much faster when no bias corrections are needed (only the necessary predictions are computed)
  • Marginal posterior corrections with loopred (Laplace) and cross-validation (see the sketch after this list)
  • More robust computation of marginal posterior corrections
  • More robust density estimation in lgpdens (the default parameters have changed)
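
As a rough sketch of the cross-validation utilities touched by these improvements, the calls below use the toolbox's gp_loopred and gp_kfcv routines on a fitted model gp with training data x, y; the exact output arguments shown are an assumption for illustration rather than the definitive interface.

    % Leave-one-out predictions (Laplace/EP/exact, depending on the model);
    % output list is assumed here for illustration
    [Eft, Varft, lpyt] = gp_loopred(gp, x, y);

    % k-fold cross-validation summary statistics (assumed default folds)
    cvres = gp_kfcv(gp, x, y);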

Bug fixes

  • Mex files are now placed in the correct folders when compiled with SuiteSparse (covariance matrix computation is now much faster)
  • Fixed a bug with the default marginal posterior correction when using gp_predcm
  • Fixed conditions in likelihood functions for grid approximation of predictions with marginal posterior corrections
  • Fixed outputs of gpmc_preds with multilatent models (thanks to Mahdi Biparva for pointing this out)
  • Various minor bug fixes
Supported Operating Systems: Agnostic, Platform Independent
Data Formats: Matlab, Octave
Tags: Classification, Regression, Machine Learning, Nonparametric Bayes, Gaussian Process, Bayesian Inference
