- Description:
If you use GPstuff, please cite: Jarno Vanhatalo, Jaakko Riihimäki, Jouni Hartikainen, Pasi Jylänki, Ville Tolvanen, Aki Vehtari (2013). GPstuff: Bayesian Modeling with Gaussian Processes. Journal of Machine Learning Research, 14:1175-1179.
See also user guide at http://arxiv.org/abs/1206.5754
GPstuff is a toolbox for Bayesian modeling with Gaussian processes, with the following features and more:
- Several covariance functions (e.g. squared exponential, exponential, Matérn, periodic, and a compactly supported piecewise polynomial function)
- Sums, products and scaling of covariance functions
- Euclidean and delta distance
- Several mean functions with marginalized parameters
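In GPstuff the covariance functions listed above are built as separate structures and combined when the model is assembled with gp_set; passing a cell array of covariance functions gives their sum. A minimal sketch in the style of the GPstuff demos (the hyperparameter values are arbitrary illustrations, and gpcf_prod is mentioned only as the product counterpart):

    % Sum of a squared exponential and a periodic covariance, with
    % hierarchical priors on the length-scale and magnitude of the first one
    pl  = prior_t();                              % Student-t hyperprior
    pm  = prior_sqrtunif();                       % uniform prior on the square root
    cf1 = gpcf_sexp('lengthScale', 1.1, 'magnSigma2', 0.2, ...
                    'lengthScale_prior', pl, 'magnSigma2_prior', pm);
    cf2 = gpcf_periodic('period', 1);
    lik = lik_gaussian('sigma2', 0.2);
    gp  = gp_set('lik', lik, 'cf', {cf1, cf2});   % {cf1, cf2} is the summed covariance
    % a product of covariances can be formed with gpcf_prod('cf', {cf1, cf2})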
- Several likelihood/observation models
- Continuous observations: Gaussian, Gaussian scale mixture (MCMC only), Student's t, quantile regression
- Classification: Logit, Probit, multinomial logit (softmax), multinomial probit
- Count data: Binomial, Poisson, (Zero truncated) Negative-Binomial, Hurdle model, Zero-inflated Negative-Binomial, Multinomial
- Survival: Cox-PH, Weibull, log-Gaussian, log-logistic
- Point process: Log-Gaussian Cox process
- Density estimation and regression: logistic GP
- Other: derivative observations (for the squared exponential covariance function only)
- Monotonicity information
- Hierarchical priors for hyperparameters
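A different observation model is selected simply by passing another lik_* structure to gp_set. A hedged sketch for binary classification with the probit model, assuming x and y (labels in {-1,1}) are the training data:

    % Probit GP classification; a small jitter stabilizes the covariance matrix
    lik = lik_probit();
    cf  = gpcf_sexp('lengthScale', 1, 'magnSigma2', 1);
    gp  = gp_set('lik', lik, 'cf', cf, 'jitterSigma2', 1e-6);
    % count data would use e.g. lik_poisson() or lik_negbin() instead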
- Sparse models
- Sparse matrix routines for compactly supported covariance functions
- Fully and partially independent conditional (FIC, PIC)
- Compactly supported plus FIC (CS+FIC)
- Variational sparse (VAR), Deterministic training conditional (DTC), Subset of regressors (SOR) (Gaussian/EP only)
- PASS-GP
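Sparse approximations are chosen with the 'type' option of gp_set together with a set of inducing inputs. A sketch assuming x is the n-by-d training input matrix; taking the inducing inputs as a subset of the data and the exact 'infer_params' string are illustrative assumptions to be checked against the user guide:

    % FIC sparse approximation with inducing inputs X_u
    X_u = x(1:10:end, :);                            % crude choice of inducing inputs
    gp_fic = gp_set('type', 'FIC', 'lik', lik_gaussian(), ...
                    'cf', gpcf_sexp(), 'X_u', X_u);
    % 'type' could also be 'PIC', 'CS+FIC', 'VAR', 'DTC' or 'SOR'
    % optionally infer the inducing inputs as well (option value is an assumption)
    gp_fic = gp_set(gp_fic, 'infer_params', 'covariance+likelihood+inducing');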
- Latent inference
- Exact (Gaussian only)
- Laplace, Expectation propagation (EP), Parallel EP, Robust-EP
- Marginal posterior corrections (cm2 and fact)
- Scaled Metropolis, Hamiltonian Monte Carlo (HMC), Scaled HMC, Elliptical slice sampling
- State space inference (1D for some covariance functions)
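With a non-Gaussian likelihood the method for integrating over the latent values is chosen with the 'latent_method' option of gp_set; for a Gaussian likelihood the exact solution is used. A minimal sketch:

    % Laplace or EP approximation, or MCMC, for the latent values
    gp = gp_set('lik', lik_probit(), 'cf', gpcf_sexp());
    gp = gp_set(gp, 'latent_method', 'EP');      % alternatives: 'Laplace', 'MCMC'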
- Hyperparameter inference
- Type II ML/MAP
- Leave-one-out cross-validation (LOO-CV), Laplace/EP LOO-CV
- Metropolis, HMC, No-U-Turn-Sampler (NUTS), Slice Sampling (SLS), Surrogate SLS, Shrinking-rank SLS, Covariance-matching SLS
- Grid, CCD, Importance sampling
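Hyperparameter inference is done with gp_optim (type II ML/MAP), gp_mc (sampling) or gp_ia (grid/CCD/importance-sampling integration). A sketch assuming gp, x and y from above and test inputs xt:

    % Type II MAP by optimizing the marginal posterior of the hyperparameters
    opt = optimset('TolFun', 1e-3, 'TolX', 1e-3);
    gp  = gp_optim(gp, x, y, 'opt', opt);
    % MCMC over the hyperparameters (and latents, if latent_method is 'MCMC')
    rgp = gp_mc(gp, x, y, 'nsamples', 200, 'display', 20);
    % deterministic integration over the hyperparameters
    gpa = gp_ia(gp, x, y, 'int_method', 'CCD');
    % predictions for the test inputs xt (assumed)
    [Eft, Varft] = gp_pred(gp, x, y, xt);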
- Model assessment
- LOO-CV, Laplace/EP LOO-CV, IS-LOO-CV, k-fold-CV
- WAIC, DIC
- Average predictive comparison
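The assessment utilities operate on a fitted model (or on the array returned by gp_ia or the record returned by gp_mc). A hedged sketch with the single-GP versions; the default number of folds and the 'focus' option value are assumptions to be verified in the function help:

    % leave-one-out predictive density (analytic for Gaussian, Laplace/EP otherwise)
    [Eft, Varft, lpyt] = gp_loopred(gp, x, y);
    % k-fold cross-validation, WAIC and DIC
    cvres = gp_kfcv(gp, x, y);                 % 10-fold by default (assumption)
    waic  = gp_waic(gp, x, y);
    [dic, p_eff] = gp_dic(gp, x, y, 'focus', 'latent');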
- Changes to previous version:
2014-04-11 Version 4.4
New features
Monotonicity constraint for the latent function (see the sketch at the end of this section).
- Riihimäki and Vehtari (2010). Gaussian processes with monotonicity information. Journal of Machine Learning Research: Workshop and Conference Proceedings, 9:645-652.
State space implementation for GP inference (1D) using Kalman filtering.
- For the following covariance functions: Squared-Exponential, Matérn-3/2 & 5/2, Exponential, Periodic, Constant
- Särkkä, S., Solin, A., Hartikainen, J. (2013). Spatiotemporal learning via infinite-dimensional Bayesian filtering and smoothing. IEEE Signal Processing Magazine, 30(4):51-61.
- Särkkä, S. (2013). Bayesian Filtering and Smoothing. Cambridge University Press.
- Solin, A. and Särkkä, S. (2014). Explicit link between periodic covariance functions and state space models. AISTATS 2014.
Improvements
- GP_PLOT function for quick plotting of GP predictions
- GP_IA now warns if it detects multimodal posterior distributions
- Much faster EP with the log-Gaussian likelihood (numerical integrals replaced with analytical results)
- Faster WAIC with GP_IA arrays (numerical integrals replaced with analytical results)
New demos demonstrating the new features
- demo_minimal, minimal demo for regression and classification
- demo_kalman1, demo_kalman2
- demo_monotonic, demo_monotonic2
Plus bug fixes
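For the monotonicity constraint mentioned above, the GP_MONOTONIC function augments a model with virtual derivative observations; the sketch below follows the pattern of demo_monotonic, but the exact option names should be checked there. GP_PLOT (also new in this release) gives a quick look at the predictions.

    % Monotonicity: assume the latent function increases along input dimension 1
    % ('nvd' lists the constrained dimensions; option names are assumptions here)
    gp = gp_set('lik', lik_gaussian(), 'cf', gpcf_sexp());
    gp = gp_monotonic(gp, x, y, 'nvd', 1, 'optimize', 'on');
    gp_plot(gp, x, y);                        % quick plot of the GP predictions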
- Supported Operating Systems: Agnostic, Platform Independent
- Data Formats: Matlab, Octave
- Tags: Classification, Regression, Machine Learning, Nonparametric Bayes, Gaussian Process, Bayesian Inference