Project details for GPstuff


by avehtari - June 9, 2016, 17:45:15 CET

Description:

If you use GPstuff, please cite: Jarno Vanhatalo, Jaakko Riihimäki, Jouni Hartikainen, Pasi Jylänki, Ville Tolvanen, and Aki Vehtari (2013). GPstuff: Bayesian Modeling with Gaussian Processes. Journal of Machine Learning Research, 14:1175-1179.

See also the user guide at http://arxiv.org/abs/1206.5754

GPstuff is a toolbox for Bayesian modeling with Gaussian processes, with the following features and more (brief usage sketches follow the list):

  • Several covariance functions (e.g., squared exponential, exponential, Matérn, periodic, and a compactly supported piecewise polynomial function)
    • Sums, products and scaling of covariance functions
    • Euclidean and delta distance
  • Several mean functions with marginalized parameters
  • Several likelihood/observation models
    • Continuous observations: Gaussian, Gaussian scale mixture (MCMC only), Student-t, quantile regression
    • Classification: Logit, Probit, multinomial logit (softmax), multinomial probit
    • Count data: Binomial, Poisson, (Zero truncated) Negative-Binomial, Hurdle model, Zero-inflated Negative-Binomial, Multinomial
    • Survival: Cox-PH, Weibull, log-Gaussian, log-logistic
    • Point process: Log-Gaussian Cox process
    • Density estimation and regression: logistic GP
    • Other: derivative observations (for the squared exponential covariance function only)
    • Monotonicity information
  • Hierarchical priors for hyperparameters
  • Sparse models
    • Sparse matrix routines for compactly supported covariance functions
    • Fully and partially independent conditional (FIC, PIC)
    • Compactly supported plus FIC (CS+FIC)
    • Variational sparse (VAR), Deterministic training conditional (DTC), Subset of regressors (SOR) (Gaussian/EP only)
    • PASS-GP
  • Latent inference
    • Exact (Gaussian only)
    • Laplace, Expectation propagation (EP), Parallel EP, Robust-EP
    • Marginal posterior corrections (cm2 and fact)
    • Scaled Metropolis, Hamiltonian Monte Carlo (HMC), Scaled HMC, Elliptical slice sampling
    • State space inference (1D for some covariance functions)
  • Hyperparameter inference
    • Type II ML/MAP
    • Leave-one-out cross-validation (LOO-CV), Laplace/EP LOO-CV
    • Metropolis, HMC, No-U-Turn-Sampler (NUTS), Slice Sampling (SLS), Surrogate SLS, Shrinking-rank SLS, Covariance-matching SLS
    • Grid, CCD, Importance sampling
  • Model assessment
    • LOO-CV, Laplace/EP LOO-CV, IS-LOO-CV, k-fold-CV
    • WAIC, DIC
    • Average predictive comparison
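
As a first orientation, the sketch below builds a basic regression model in the style of the GPstuff demos: a squared exponential covariance function, a Gaussian likelihood, type II MAP estimation of the hyperparameters, and prediction at test inputs. The data variables x, y, and xt are placeholders, and the initial parameter values are arbitrary.

    % GP regression: x, y are training inputs/targets, xt are test inputs
    lik  = lik_gaussian('sigma2', 0.2^2);                 % Gaussian observation model
    gpcf = gpcf_sexp('lengthScale', 1, 'magnSigma2', 1);  % squared exponential covariance
    gp   = gp_set('lik', lik, 'cf', gpcf);                % assemble the model structure

    % Type II MAP estimate of the hyperparameters
    opt = optimset('TolFun', 1e-3, 'TolX', 1e-3);
    gp  = gp_optim(gp, x, y, 'opt', opt);

    % Posterior predictive mean and variance of the latent function at xt
    [Eft, Varft] = gp_pred(gp, x, y, xt);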
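
With a non-Gaussian likelihood, the latent inference method is chosen when the model is set up. Below is a sketch of probit classification with expectation propagation, assuming class labels y in {-1, +1}; the 'yt' argument asks gp_pred for log predictive class probabilities.

    % Probit classification with EP latent inference (y in {-1, +1})
    lik  = lik_probit();
    gpcf = gpcf_sexp();
    gp   = gp_set('lik', lik, 'cf', gpcf, 'latent_method', 'EP', 'jitterSigma2', 1e-6);

    gp = gp_optim(gp, x, y);                    % type II MAP for the hyperparameters
    [Eft, Varft, lpyt] = gp_pred(gp, x, y, xt, 'yt', ones(size(xt,1),1));
    pt = exp(lpyt);                             % predictive probability of class +1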
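
For larger data sets, a sparse approximation can be selected in the same call. The sketch below sets up a FIC model whose inducing inputs X_u are, purely for illustration, a subset of the training inputs.

    % FIC sparse approximation with inducing inputs X_u
    X_u = x(1:10:end, :);                       % e.g. every 10th training input
    gp  = gp_set('type', 'FIC', 'lik', lik_gaussian(), 'cf', gpcf_sexp(), ...
                 'X_u', X_u, 'jitterSigma2', 1e-6);
    gp  = gp_optim(gp, x, y);                   % optimize hyperparameters
    [Eft, Varft] = gp_pred(gp, x, y, xt);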
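
Model assessment works on a fitted model structure. The following sketch covers approximate leave-one-out cross-validation, WAIC, and k-fold cross-validation; the function names and options reflect my reading of the GPstuff documentation and should be checked against the shipped help texts.

    % Approximate LOO predictive densities (Laplace/EP LOO for non-Gaussian likelihoods)
    [Eft_loo, Varft_loo, lpyt_loo] = gp_loopred(gp, x, y);
    mlpd_loo = mean(lpyt_loo);                  % mean log predictive density

    waic  = gp_waic(gp, x, y);                  % widely applicable information criterion
    cvres = gp_kfcv(gp, x, y, 'k', 10);         % 10-fold CV (refits the model k times)
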
Changes from the previous version:

2016-06-09 Version 4.7

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo
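
The demo itself ships with the toolbox. As a rough illustration of the idea behind it (not the demo's actual code), an expected improvement acquisition for minimization can be computed from the gp_pred output of a GP regression model gp fitted to the objective evaluations (x, y), with xt a set of candidate points:

    % Expected improvement (EI) over candidate points xt, for minimization
    [Eft, Varft] = gp_pred(gp, x, y, xt);
    s    = sqrt(max(Varft, 1e-12));             % predictive standard deviation
    fmin = min(y);                              % best observed value so far
    z    = (fmin - Eft) ./ s;
    Phi  = 0.5 * erfc(-z / sqrt(2));            % standard normal CDF
    phi  = exp(-0.5 * z.^2) / sqrt(2*pi);       % standard normal PDF
    EI   = (fmin - Eft) .* Phi + s .* phi;      % expected improvement
    [~, ibest] = max(EI);
    x_next = xt(ibest, :);                      % next point to evaluate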

Improvements

  • Improved use of Pareto smoothed importance sampling (PSIS)
  • More options added to gp_monotonic (see the sketch after this list)
  • Monotonicity now works for additive covariance functions with selected variables
  • Possibility to use the gpcf_squared covariance function with derivative observations/monotonicity
  • Default behaviour made more robust by changing default jitter from 1e-9 to 1e-6
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016). Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models. JMLR, accepted for publication)
  • The selected-variables option now works better with monotonicity
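
As a sketch of how the monotonicity and jitter items above show up in code (the gp_monotonic option names 'nvd' and 'optimize' are my reading of its help text and should be verified against the shipped documentation):

    % Monotonic GP via virtual derivative observations; jitterSigma2 1e-6 is the new default
    gp = gp_set('lik', lik_gaussian(), 'cf', gpcf_sexp(), 'jitterSigma2', 1e-6);
    % Request monotonicity: increasing in input 1, decreasing in input 2 (signs in 'nvd')
    gp = gp_monotonic(gp, x, y, 'nvd', [1 -2], 'optimize', 'on');
    [Eft, Varft] = gp_pred(gp, x, y, xt);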

Bugfixes

  • Small error in derivative observation computation fixed
  • Several minor bug fixes
Supported Operating Systems: Agnostic, Platform Independent
Data Formats: Matlab, Octave
Tags: Classification, Regression, Machine Learning, Nonparametric Bayes, Gaussian Process, Bayesian Inference
