Description:
If you use GPstuff, please cite: Jarno Vanhatalo, Jaakko Riihimäki, Jouni Hartikainen, Pasi Jylänki, Ville Tolvanen and Aki Vehtari (2013). GPstuff: Bayesian Modeling with Gaussian Processes. Journal of Machine Learning Research, 14:1175-1179.
See also the user guide at http://arxiv.org/abs/1206.5754
GPstuff is a toolbox for Bayesian modeling with Gaussian processes, with the following features and more:
- Several covariance functions (e.g. squared exponential, exponential, Matérn, periodic and a compactly supported piecewise polynomial function); see the sketch below
- Sums, products and scaling of covariance functions
- Euclidean and delta distance
- Several mean functions with marginalized parameters
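A minimal sketch of how these combine, assuming the standard GPstuff constructors (the hyperparameter values are purely illustrative):

    % Sum of a squared exponential and a periodic covariance function;
    % passing a cell array of covariance functions to gp_set sums them.
    gpcf1 = gpcf_sexp('lengthScale', 1, 'magnSigma2', 0.5);
    gpcf2 = gpcf_periodic('period', 2);
    lik   = lik_gaussian('sigma2', 0.1);
    gp    = gp_set('lik', lik, 'cf', {gpcf1, gpcf2});

Products are formed analogously with gpcf_prod('cf', {gpcf1, gpcf2}).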
- Several likelihood/observation models (see the sketch after this list)
- Continuous observations: Gaussian, Gaussian scale mixture (MCMC only), Student's t, quantile regression
- Classification: Logit, Probit, multinomial logit (softmax), multinomial probit
- Count data: Binomial, Poisson, (Zero truncated) Negative-Binomial, Hurdle model, Zero-inflated Negative-Binomial, Multinomial
- Survival: Cox-PH, Weibull, log-Gaussian, log-logistic
- Point process: Log-Gaussian Cox process
- Density estimation and regression: logistic GP
- Other: derivative observations (for the squared exponential covariance function only)
- Monotonicity information
- Hierarchical priors for hyperparameters
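For instance, switching from regression to binary classification only changes the likelihood and the latent method; a sketch, assuming x is an n-by-d input matrix and y holds labels in {-1, +1}:

    % Binary GP classification with a probit likelihood and EP inference.
    lik  = lik_probit();
    gpcf = gpcf_sexp('lengthScale', ones(1, size(x, 2)));
    gp   = gp_set('lik', lik, 'cf', gpcf, 'latent_method', 'EP');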
- Sparse models (see the sketch after this list)
- Sparse matrix routines for compactly supported covariance functions
- Fully and partially independent conditional (FIC, PIC)
- Compactly supported plus FIC (CS+FIC)
- Variational sparse (VAR), Deterministic training conditional (DTC), Subset of regressors (SOR) (Gaussian/EP only)
- PASS-GP
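Sparse approximations are chosen through the 'type' option of gp_set; a hedged FIC sketch, with inducing inputs X_u taken here as a random subset of the data purely for illustration:

    % FIC sparse approximation with m inducing inputs picked from x.
    m   = 50;
    X_u = x(randperm(size(x, 1), m), :);
    gp  = gp_set('type', 'FIC', 'lik', lik_gaussian(), ...
                 'cf', gpcf_sexp(), 'X_u', X_u);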
- Latent inference (see the sketch after this list)
- Exact (Gaussian only)
- Laplace, Expectation propagation (EP), Parallel EP, Robust-EP
- Marginal posterior corrections (cm2 and fact)
- Scaled Metropolis, Hamiltonian Monte Carlo (HMC), Scaled HMC, Elliptical slice sampling
- State space inference (1D for some covariance functions)
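With a non-Gaussian likelihood, the chosen latent method is used automatically by the prediction routines; full MCMC over latent values and hyperparameters is a short script (a sketch reusing the gp structure from above; sample counts and thinning are illustrative):

    % MCMC over latent values and hyperparameters, then prediction.
    [rgp, gp, opt] = gp_mc(gp, x, y, 'nsamples', 200);
    rgp = thin(rgp, 50, 2);                 % drop burn-in and thin the chain
    [Eft, Varft] = gp_pred(rgp, x, y, xt);  % predictions averaged over samples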
- Hyperparameter inference (see the sketch after this list)
- Type II ML/MAP
- Leave-one-out cross-validation (LOO-CV), Laplace/EP LOO-CV
- Metropolis, HMC, No-U-Turn-Sampler (NUTS), Slice Sampling (SLS), Surrogate SLS, Shrinking-rank SLS, Covariance-matching SLS
- Grid, CCD, Importance sampling
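A sketch of the two most common routes, type II MAP optimization followed by integration over the hyperparameters (option names follow gp_optim and gp_ia; the tolerances are illustrative):

    % Type II MAP estimate of the hyperparameters, then CCD integration.
    opt  = optimset('TolFun', 1e-3, 'TolX', 1e-3);
    gp   = gp_optim(gp, x, y, 'opt', opt);        % MAP with a quasi-Newton optimizer
    gpia = gp_ia(gp, x, y, 'int_method', 'CCD');  % array of GPs with weights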
- Model assessment (see the sketch after this list)
- LOO-CV, Laplace/EP LOO-CV, IS-LOO-CV, k-fold-CV
- WAIC, DIC
- Average predictive comparison
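A sketch of the basic assessment calls for a fitted model gp (the output arguments reflect my reading of gp_loopred and gp_waic and should be treated as assumptions):

    % Leave-one-out and WAIC assessment.
    [Efl, Varfl, lpyt] = gp_loopred(gp, x, y);  % LOO predictive terms
    waic = gp_waic(gp, x, y);                   % WAIC estimate
    mlpd = mean(lpyt);                          % mean log predictive density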
Changes to previous version:
2016-06-09 Version 4.7
Development and release branches available at https://github.com/gpstuff-dev/gpstuff
New features
- Simple Bayesian Optimization demo
Improvements
- Improved use of PSIS
- More options added to gp_monotonic (see the sketch after this changelog entry)
- Monotonicity now works for additive covariance functions with selected variables
- Possibility to use the gpcf_squared.m covariance function with derivative observations/monotonicity
- Default behaviour made more robust by changing the default jitter from 1e-9 to 1e-6
- LA-LOO uses the cavity method as the default (see Vehtari et al. (2016). Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models. JMLR, accepted for publication)
- The selected-variables option now works better with monotonicity
Bugfixes
- Fixed a small error in the derivative observation computation
- Several minor bug fixes
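A hedged sketch of gp_monotonic as used in the monotonicity demos; the 'nvd' option and its sign convention (positive for increasing, negative for decreasing input dimensions) are my reading of those demos, not a definitive API description:

    % Enforce monotonicity of the latent function with virtual derivative
    % observations: increasing in input dimension 1, decreasing in 2 (assumed signs).
    gp = gp_set('lik', lik_probit(), 'cf', gpcf_sexp());
    gp = gp_monotonic(gp, x, y, 'nvd', [1 -2], 'optimize', 'on');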
Supported Operating Systems: Agnostic, Platform Independent
Data Formats: Matlab, Octave
Tags: Classification, Regression, Machine Learning, Nonparametric Bayes, Gaussian Process, Bayesian Inference
Other available revisions

Version 4.7 (2016-06-09): changelog as above.
Version 4.6 (2015-07-09)
Development and release branches available at https://github.com/gpstuff-dev/gpstuff
New features
- Use Pareto smoothed importance sampling (Vehtari & Gelman, 2015) for:
  - importance sampling leave-one-out cross-validation (gpmc_loopred.m)
  - importance sampling integration over hyperparameters (gp_ia.m)
  - the importance sampling part of the logistic Gaussian process density estimation (lgpdens.m)
- References:
  - Aki Vehtari and Andrew Gelman (2015). Pareto smoothed importance sampling. arXiv preprint arXiv:1507.02646.
  - Aki Vehtari, Andrew Gelman and Jonah Gabry (2015). Efficient implementation of leave-one-out cross-validation and WAIC for evaluating fitted Bayesian models.
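Of the functions above, lgpdens is the most self-contained entry point; a minimal sketch (the output arguments reflect my reading of lgpdens and are an assumption):

    % Logistic Gaussian process density estimation from 1-D samples.
    x = randn(500, 1);
    lgpdens(x);                 % with no outputs, plots the density estimate
    [p, pq, xt] = lgpdens(x);   % density, quantiles and evaluation grid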
New covariance functions
- gpcf_additive creates a mixture over products of kernels for each dimension. Reference: Duvenaud, D. K., Nickisch, H., & Rasmussen, C. E. (2011). Additive Gaussian processes. In Advances in Neural Information Processing Systems, pp. 226-234.
- gpcf_linearLogistic corresponds to the logistic mean function
- gpcf_linearMichelismenten corresponds to the Michaelis-Menten mean function
Improvements
- Faster EP moment calculation for lik_logit
Several minor bugfixes
Version 4.5 (2014-07-22)
New features
- Input-dependent noise and signal variance.
  - Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of the IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
- Sparse stochastic variational inference model.
  - Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
- Option 'autoscale' in gp_rnd.m to get split-normal approximated samples from the posterior predictive distribution of the latent variable (see the sketch below).
  - Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.
  - Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.
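A hedged sketch of gp_rnd with the new option; the 'autoscale' value is assumed to follow GPstuff's usual 'on'/'off' convention:

    % Draw samples of latent values at test inputs xt using the
    % split-normal approximation ('autoscale', added in this release).
    sampft = gp_rnd(gp, x, y, xt, 'nsamp', 400, 'autoscale', 'on');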
Improvements
- New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
- Precomputed demo results (including the figures) are now available in the folder tests/realValues.
- New demos demonstrating new features etc.
- demo_epinf, demonstrating the input dependent noise and signal variance model
- demo_svi_regression, demo_svi_classification
- demo_modelcomparison2, demo_survival_comparison
Several minor bugfixes
Version 4.4 (2014-04-11)
New features
Monotonicity constraint for the latent function.
- Riihimäki and Vehtari (2010). Gaussian processes with monotonicity information. Journal of Machine Learning Research: Workshop and Conference Proceedings, 9:645-652.
State space implementation for GP inference (1D) using Kalman filtering.
- For the following covariance functions: Squared-Exponential, Matérn-3/2 & 5/2, Exponential, Periodic, Constant
- Särkkä, S., Solin, A. and Hartikainen, J. (2013). Spatiotemporal learning via infinite-dimensional Bayesian filtering and smoothing. IEEE Signal Processing Magazine, 30(4):51-61.
- Simo Särkkä (2013). Bayesian filtering and smoothing. Cambridge University Press.
- Solin, A. and Särkkä, S. (2014). Explicit link between periodic covariance functions and state space models. AISTATS 2014.
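A hedged sketch of the state-space route, following the Kalman demos (the 'type', 'KALMAN' option name is my reading of those demos):

    % 1-D GP regression solved by Kalman filtering and smoothing instead
    % of the O(n^3) dense solution; x and xt are column vectors of time points.
    gpcf = gpcf_matern32('lengthScale', 1, 'magnSigma2', 1);
    gp   = gp_set('lik', lik_gaussian(), 'cf', gpcf, 'type', 'KALMAN');
    [Eft, Varft] = gp_pred(gp, x, y, xt);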
Improvements
- GP_PLOT function for quick plotting of GP predictions
- GP_IA now warns if it detects multimodal posterior distributions
- Much faster EP with the log-Gaussian likelihood (numerical integrals replaced by analytical results)
- Faster WAIC with a GP_IA array (numerical integrals replaced by analytical results)
- New demos demonstrating new features etc.
- demo_minimal, minimal demo for regression and classification
- demo_kalman1, demo_kalman2
- demo_monotonic, demo_monotonic2
Plus bug fixes
Version 4.3.1 (2013-11-26)
Improvements:
- Updated cpsrf and psrf to follow BDA3: split each chain into two halves and use Geyer's IPSE for n_eff
- Multi-latent models for Octave
Version 4.3 (2013-10-14)
Improvements:
- lgpdens.m: better default estimation using importance and rejection sampling, better default priors (see updated paper http://arxiv.org/abs/1211.0174)
- Robust-EP for zero truncated negative-binomial likelihood
- If moment computations in EP return NaN, return NaN energy (handled gracefully by fminlbfgs and fminscg)
- gp_cpred.m: new option 'target'
- gp_ia.m: Changed Hessian computation stepsize to 1e-3
- gpstuff_version.m: function for returning current GPstuff version
- gpia_jpreds.m: a new function
- demo_survival_weibull.m renamed to demo_survival_aft.m
Bug fixes:
- build the SuiteSparse path correctly if it includes spaces
- gp_avpredcomp.m: fixed for Cox-PH
- gp_cpred.m: fixed for Cox-PH
- esls.m: don't accept a step to a point with infinite log likelihood
- gp_ia.m: removed some redundant computation
- gp_rnd.m: works now for multilatent models also
- bugfixes for setrandstream
- other bugfixes
Version 4.2 (2013-06-14)
Improvements
- Cross-validation much faster if no bias-corrections are needed (computes only the necessary predictions)
- Marginal posterior corrections with loopred (Laplace) and cross-validation
- More robust computation of marginal posterior corrections
- More robust density estimation in lgpdens (default parameters changed)
Bug fixes
- Mex files now in correct folders if compiled with SuiteSparse (covariance matrix computation now much faster)
- Fixed bug with default marginal posterior correction when using gp_predcm
- Fixed conditions in likelihood functions for grid approximation of predictions with marginal posterior corrections
- Fixed outputs of gpmc_preds with multilatent models (thanks to Mahdi Biparva for pointing this out)
- and some minor bug fixes
Version 4.1 (2013-04-24)
New features:
- Multinomial probit classification with nested EP. Jaakko Riihimäki, Pasi Jylänki and Aki Vehtari (2013). Nested Expectation Propagation for Gaussian Process Classification with a Multinomial Probit Likelihood. Journal of Machine Learning Research, 14:75-109.
- Marginal posterior corrections for latent values (see the sketch below). Cseke & Heskes (2011). Approximate Marginals in Latent Gaussian Models. Journal of Machine Learning Research, 12:417-454.
  - Laplace: cm2 and fact
  - EP: fact
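A hedged sketch of applying these corrections with gp_predcm; the argument list, the 'ind' option and the grid-based evaluation are assumptions based on the demo names, not a definitive signature:

    % Corrected marginal posterior density of one latent value, evaluated
    % over a grid fvec with the 'fact' correction (signature assumed).
    gp = gp_set(gp, 'latent_method', 'Laplace');
    gp = gp_optim(gp, x, y);
    fvec = linspace(-4, 4, 100)';
    pc = gp_predcm(gp, x, y, fvec, 'ind', 1, 'correction', 'fact');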
Improvements
- lgpdens ignores now NaNs instead of giving error
- gp_cpred has a new option 'target' accepting values 'f' or 'mu'
- Unified gp_waic and gp_dic
  - by default return mlpd
  - option 'form' now accepts values 'mean', 'all', 'sum' and 'dic'
- Improved survival demo demo_survival_aft (accelerated failure time)
  - renamed and improved from demo_survival_weibull
- rearranged some files to more logical directories
- bug fixes
New files
- gp_predcm: marginal posterior corrections for latent values.
- demo_improvedmarginals: demonstration of marginal posterior corrections
- demo_improvedmarginals2: demonstration of marginal posterior corrections
- lik_multinomprobit: multinomial probit likelihood
- demo_multiclass_nested_ep: demonstration of nested EP with multinomprobit
Version 4.0 (2013-03-22): initial announcement on mloss.org.