Projects tagged with Gaussian process.


Aboleth 0.7

by dsteinberg - December 14, 2017, 02:39:19 CET [ Project Homepage BibTeX Download ] 14362 views, 3270 downloads, 0 subscriptions

About: A bare-bones TensorFlow framework for Bayesian deep learning and Gaussian process approximation

Changes:

Release 0.7.0

  • Update to TensorFlow r1.4.

  • Tutorials in the documentation on:

      • Interfacing with Keras

      • Saving/loading models

      • How to build a variety of regressors with Aboleth

  • New prediction module with some convenience functions, including freezing the weight samples during prediction.

  • Bayesian convolutional layers with accompanying demo.

  • Allow the number of samples drawn from a model to be varied by using placeholders (see the sketch after this list).

  • Generalise the feature embedding layers to work on matrix inputs (instead of just column vectors).

  • Numerous numerical and usability fixes.
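
The placeholder mechanism mentioned above is a standard TensorFlow r1.x pattern. The sketch below shows the general idea in plain TensorFlow, not Aboleth's actual API; all names and shapes here are illustrative assumptions.

    import numpy as np
    import tensorflow as tf

    # Scalar placeholder controlling how many Monte Carlo weight samples are
    # drawn; feed a small value during training and a larger one at prediction.
    n_samples = tf.placeholder(tf.int32, shape=[])
    X = tf.placeholder(tf.float32, shape=[None, 3])

    # Toy Bayesian linear layer: draw n_samples weight matrices from a
    # diagonal Gaussian variational posterior q(W) = N(mu, exp(log_sigma)^2).
    mu = tf.get_variable('mu', shape=[3, 1])
    log_sigma = tf.get_variable('log_sigma', shape=[3, 1])
    eps = tf.random_normal(tf.stack([n_samples, 3, 1]))
    W = mu[None, :, :] + tf.exp(log_sigma)[None, :, :] * eps

    # Apply every weight sample to the batch: result is (n_samples, batch, 1).
    f = tf.map_fn(lambda w: tf.matmul(X, w), W)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        x = np.random.randn(5, 3).astype(np.float32)
        f_cheap = sess.run(f, {X: x, n_samples: 3})     # few samples: fast
        f_exact = sess.run(f, {X: x, n_samples: 100})   # many samples: accurate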


GPstuff 4.7

by avehtari - June 9, 2016, 17:45:15 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 81233 views, 18591 downloads, 0 subscriptions

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.
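
GPstuff itself is written for MATLAB/Octave; as a language-neutral illustration of the core computation behind such toolboxes, here is a minimal NumPy sketch of exact GP regression with a squared-exponential kernel. All names and hyperparameter values are illustrative, and the jitter term plays the same numerical role as the jitter discussed in the changelog below.

    import numpy as np

    def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
        # Squared-exponential covariance k(a,b) = s^2 exp(-||a-b||^2 / (2 l^2)).
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return variance * np.exp(-0.5 * sq / lengthscale**2)

    def gp_posterior(X, y, Xs, noise=1e-2, jitter=1e-6):
        # Exact GP regression: posterior mean and variance at test inputs Xs.
        K = rbf_kernel(X, X) + (noise + jitter) * np.eye(len(X))
        Ks = rbf_kernel(X, Xs)
        L = np.linalg.cholesky(K)                       # K = L L^T
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        mean = Ks.T @ alpha
        v = np.linalg.solve(L, Ks)
        var = np.diag(rbf_kernel(Xs, Xs)) - np.sum(v**2, axis=0)
        return mean, var

    # Toy 1-D example.
    X = np.linspace(0, 1, 20)[:, None]
    y = np.sin(6 * X[:, 0]) + 0.1 * np.random.randn(20)
    mean, var = gp_posterior(X, y, np.linspace(0, 1, 100)[:, None])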

Changes:

2016-06-09 Version 4.7

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo

Improvements

  • Improved use of PSIS (Pareto smoothed importance sampling)
  • More options added to gp_monotonic
  • Monotonicity now works for additive covariance functions with selected variables
  • Possibility to use the gpcf_squared.m covariance function with derivative observations/monotonicity
  • Default behaviour made more robust by changing the default jitter from 1e-9 to 1e-6
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016). Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models. JMLR, accepted for publication)
  • The selected-variables option now works better with monotonicity

Bugfixes

  • small error in derivative observation computation fixed
  • several minor bug fixes

BayesOpt, a Bayesian Optimization toolbox 0.8.2

by rmcantin - December 9, 2015, 04:53:31 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 63206 views, 12073 downloads, 0 subscriptions

About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design, and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). There are also interfaces for C, Matlab/Octave, and Python.
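
BayesOpt's own C/C++/Matlab/Python interfaces are not reproduced here. The following is a hedged sketch of the methodology the package implements (a GP surrogate plus an expected-improvement acquisition), written with scikit-learn purely for illustration; the objective, bounds, and iteration counts are made up.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def expected_improvement(Xc, gp, y_best):
        # EI acquisition (minimisation): expected amount by which a candidate
        # improves on the best observed value y_best.
        mu, sigma = gp.predict(Xc, return_std=True)
        sigma = np.maximum(sigma, 1e-9)
        z = (y_best - mu) / sigma
        return sigma * (z * norm.cdf(z) + norm.pdf(z))

    def objective(x):
        # Toy 1-D function to minimise; stands in for an expensive experiment.
        return np.sin(3 * x) + x**2 - 0.7 * x

    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(5, 1))        # initial design
    y = objective(X[:, 0])

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(20):                        # sequential BO iterations
        gp.fit(X, y)
        Xc = np.linspace(-2, 2, 500)[:, None]  # dense candidate grid
        x_next = Xc[np.argmax(expected_improvement(Xc, gp, y.min()))]
        X = np.vstack([X, [x_next]])
        y = np.append(y, objective(x_next[0]))

    print('best x:', X[np.argmin(y), 0], 'best y:', y.min())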

Changes:

  • Fixed bug in save/restore.

  • Fixed bug in initial design.


linearizedGP 1.0

by dsteinberg - November 28, 2014, 07:02:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 7780 views, 1580 downloads, 0 subscriptions

About: Gaussian processes with general nonlinear likelihoods using the unscented transform or Taylor series linearisation.
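
The unscented transform at the heart of this approach is easy to sketch: propagate a small set of deterministically chosen sigma points through the nonlinearity and recover a Gaussian from their weighted moments. A minimal NumPy version of the basic symmetric form follows; this is the generic transform, not the package's code, and the kappa parameter and example map are illustrative.

    import numpy as np

    def unscented_transform(mean, cov, f, kappa=0.0):
        # Propagate N(mean, cov) through a nonlinearity f using 2d+1 sigma
        # points (Julier's basic symmetric set, no beta covariance correction).
        d = len(mean)
        S = np.linalg.cholesky((d + kappa) * cov)          # matrix square root
        sigma = np.vstack([mean, mean + S.T, mean - S.T])  # (2d+1, d) points
        w = np.full(2 * d + 1, 1.0 / (2.0 * (d + kappa)))  # symmetric weights
        w[0] = kappa / (d + kappa)
        Y = np.array([f(s) for s in sigma])                # transformed points
        m = w @ Y                                          # approximate mean
        C = (Y - m).T @ ((Y - m) * w[:, None])             # approximate cov
        return m, C

    # Example: push a 2-D Gaussian through a nonlinear map.
    m, C = unscented_transform(np.zeros(2), np.eye(2),
                               lambda x: np.array([np.sin(x[0]), x[1] ** 3]))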

Changes:

Initial Announcement on mloss.org.


Toeblitz Toolkit for Fast Toeplitz Matrix Operations 1.03

by cunningham - August 13, 2014, 02:21:36 CET [ BibTeX Download ] 16418 views, 4436 downloads, 0 subscriptions

About: Toeblitz is a MATLAB/Octave package for operations on positive definite Toeplitz matrices. It can solve Toeplitz systems Tx = b in O(n log n) time and O(n) memory, compute matrix inverses T^(-1) (with the log determinant for free) in O(n^2) time and memory, compute log determinants (without inverses) in O(n^2) time and O(n) memory, and compute traces of products A*T for any matrix A in minimal O(n^2) time and memory.
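
For readers outside MATLAB/Octave, SciPy exposes comparable building blocks: scipy.linalg.solve_toeplitz (Levinson recursion, O(n^2) time and O(n) memory) and scipy.linalg.matmul_toeplitz (FFT-based products in O(n log n) time). A small sketch checking both against the dense representation; the example matrix is illustrative.

    import numpy as np
    from scipy.linalg import matmul_toeplitz, solve_toeplitz, toeplitz

    n = 512
    c = 0.95 ** np.arange(n)       # first column of a symmetric PD Toeplitz T
    b = np.random.randn(n)

    # Solve T x = b by Levinson recursion: O(n^2) time, O(n) memory.
    x = solve_toeplitz(c, b)

    # FFT-based product T @ b via circulant embedding: O(n log n) time.
    Tb = matmul_toeplitz(c, b)

    # Verify both against the explicit dense matrix.
    T = toeplitz(c)
    assert np.allclose(T @ x, b)
    assert np.allclose(T @ b, Tb)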

Changes:

Adding a write-up in written/toeblitz.pdf describing the package.


GP RTSS 1.0

by marc - March 21, 2012, 08:43:52 CET [ BibTeX BibTeX for corresponding Paper Download ] 8379 views, 2441 downloads, 0 subscriptions

About: Gaussian process RTS smoothing (forward-backward smoothing) based on moment matching.
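
GP-RTSS generalises the classical Rauch-Tung-Striebel smoother by replacing the known linear dynamics with GP models via moment matching. The linear-Gaussian forward-backward backbone it builds on is sketched below in NumPy; this is an illustration under standard Kalman/RTS assumptions, not the package's code.

    import numpy as np

    def kalman_filter(ys, A, C, Q, R, m0, P0):
        # Forward pass for x_t = A x_{t-1} + q_t, y_t = C x_t + r_t:
        # returns filtered means/covariances and the one-step predictions.
        ms, Ps, preds = [], [], []
        m, P = m0, P0
        for y in ys:
            mp, Pp = A @ m, A @ P @ A.T + Q               # predict
            preds.append((mp, Pp))
            S = C @ Pp @ C.T + R
            K = Pp @ C.T @ np.linalg.inv(S)               # Kalman gain
            m = mp + K @ (y - C @ mp)                     # update
            P = Pp - K @ S @ K.T
            ms.append(m)
            Ps.append(P)
        return ms, Ps, preds

    def rts_smoother(ms, Ps, preds, A):
        # Backward pass: condition each filtered estimate on all observations.
        sm, sP = ms[-1], Ps[-1]
        out = [(sm, sP)]
        for t in range(len(ms) - 2, -1, -1):
            mp, Pp = preds[t + 1]                         # prediction t -> t+1
            G = Ps[t] @ A.T @ np.linalg.inv(Pp)           # smoother gain
            sm = ms[t] + G @ (sm - mp)
            sP = Ps[t] + G @ (sP - Pp) @ G.T
            out.append((sm, sP))
        return out[::-1]

    # 1-D random-walk example.
    A = np.array([[1.0]]); C = np.array([[1.0]])
    Q = np.array([[0.1]]); R = np.array([[0.5]])
    ys = [np.array([v]) for v in np.cumsum(0.3 * np.random.randn(50))]
    ms, Ps, preds = kalman_filter(ys, A, C, Q, R, np.zeros(1), np.eye(1))
    smoothed = rts_smoother(ms, Ps, preds, A)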

Changes:

Initial Announcement on mloss.org.


A Local and Parallel Computation Toolbox for Gaussian Process Regression 1.0

by cwpark - March 19, 2012, 17:21:28 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 19253 views, 5670 downloads, 0 subscriptions

About: This local and parallel computation toolbox is the Octave and Matlab implementation of several localized Gaussian process regression methods: the domain decomposition method (Park et al., 2011, DDM), partially independent conditional (Snelson and Ghahramani, 2007, PIC), localized probabilistic regression (Urtasun and Darrell, 2008, LPR), and bagging for Gaussian process regression (Chen and Ren, 2009, BGP). Most of the localized regression methods can be applied to general machine learning problems, although DDM is only applicable to spatial datasets. In addition, GPLP provides two parallel computation versions of the domain decomposition method. Ease of parallelization is one of the advantages of localized regression, and the two parallel implementations provide guidance on how to realize this advantage in software.
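
The shared idea behind these localized methods can be sketched with the simplest possible variant: partition the inputs and fit an independent GP per cell. This is only an illustration of why local methods are cheap, not GPLP's DDM, which additionally enforces consistency across partition boundaries; all class and parameter names are made up.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    class LocalGP:
        # Naive localized GP: partition inputs with k-means, fit one GP per
        # cell, answer each query with its cell's expert. Training cost drops
        # from O(n^3) to roughly O(k (n/k)^3); unlike GPLP's DDM, nothing here
        # keeps predictions continuous across cell boundaries.

        def __init__(self, n_experts=8):
            self.km = KMeans(n_clusters=n_experts, n_init=10)

        def fit(self, X, y):
            labels = self.km.fit_predict(X)
            self.experts = []
            for k in range(self.km.n_clusters):
                gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel())
                gp.fit(X[labels == k], y[labels == k])
                self.experts.append(gp)
            return self

        def predict(self, X):
            labels = self.km.predict(X)
            out = np.empty(len(X))
            for k, gp in enumerate(self.experts):
                mask = labels == k
                if mask.any():
                    out[mask] = gp.predict(X[mask])
            return out

    # Usage on synthetic spatial data.
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=(2000, 2))
    y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.1 * rng.standard_normal(2000)
    preds = LocalGP(n_experts=8).fit(X, y).predict(X[:100])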

Changes:

Initial Announcement on mloss.org.