About: A bare-bones TensorFlow framework for Bayesian deep learning and Gaussian process approximation. Changes: Release 0.7.0.
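As a hedged illustration of what "Gaussian process approximation" can mean in this setting, the sketch below (plain NumPy, not this package's API; the function names are hypothetical) approximates an RBF-kernel GP with random Fourier features, so that Bayesian linear regression on the features stands in for full GP regression.

    import numpy as np

    def rff_features(X, n_features=100, lengthscale=1.0, seed=0):
        # Random Fourier features: phi(x)^T phi(x') converges to the RBF
        # kernel as n_features grows, so a Bayesian linear model on phi(X)
        # approximates GP regression (hypothetical helper, not this API).
        rng = np.random.default_rng(seed)
        W = rng.normal(scale=1.0 / lengthscale, size=(X.shape[1], n_features))
        b = rng.uniform(0.0, 2 * np.pi, size=n_features)
        return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

    def fit_bayesian_linear(Phi, y, noise=0.1, prior_var=1.0):
        # Conjugate Gaussian posterior over the feature weights.
        A = Phi.T @ Phi / noise**2 + np.eye(Phi.shape[1]) / prior_var
        Sigma = np.linalg.inv(A)            # posterior covariance
        mu = Sigma @ Phi.T @ y / noise**2   # posterior mean
        return mu, Sigma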
|
About: The GPstuff toolbox is a versatile collection of Gaussian process models and the computational tools required for inference. The tools include, among others, various inference methods, sparse approximations, and model assessment methods. Changes: 2016-06-09, Version 4.7. Development and release branches are available at https://github.com/gpstuff-dev/gpstuff. The release includes new features, improvements, and bugfixes.
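For context, the core computation behind such toolboxes is the exact GP regression posterior. Here is a minimal sketch in plain NumPy (not GPstuff's MATLAB interface), following the standard Cholesky-based formulation:

    import numpy as np

    def rbf(X1, X2, lengthscale=1.0, variance=1.0):
        # Squared-exponential (RBF) kernel matrix between two point sets.
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / lengthscale**2)

    def gp_posterior(X, y, Xs, noise=0.1):
        # Exact GP regression: predictive mean and variance at test points Xs.
        K = rbf(X, X) + noise**2 * np.eye(len(X))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        Ks = rbf(X, Xs)
        mean = Ks.T @ alpha
        v = np.linalg.solve(L, Ks)
        var = np.diag(rbf(Xs, Xs)) - (v ** 2).sum(0)
        return mean, var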
|
About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design, and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). There are also interfaces for C, Matlab/Octave, and Python. Changes: Fixed a bug in save/restore; fixed a bug in the initial design.
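To make the methodology concrete without presuming the library's C++ API, here is a minimal sketch of one Bayesian optimization step with expected improvement, one common acquisition criterion; gp_posterior stands for any GP surrogate returning a predictive mean and variance (such as the sketch above):

    import numpy as np
    from scipy.stats import norm

    def expected_improvement(mu, sigma, best_y):
        # EI for minimization: expected amount by which a candidate
        # improves on the incumbent best observation best_y.
        sigma = np.maximum(sigma, 1e-12)
        z = (best_y - mu) / sigma
        return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    def propose_next(X, y, candidates, gp_posterior):
        # Fit the surrogate on (X, y), score candidates, pick max EI.
        mu, var = gp_posterior(X, y, candidates)
        ei = expected_improvement(mu, np.sqrt(var), y.min())
        return candidates[np.argmax(ei)]

The proposed point is then evaluated on the true objective, appended to (X, y), and the loop repeats.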
|
About: Gaussian processes with general nonlinear likelihoods using the unscented transform or Taylor series linearisation. Changes: Initial Announcement on mloss.org.
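A minimal sketch of the unscented transform itself (plain NumPy, the standard scaled sigma-point formulation, not this package's interface), propagating a Gaussian N(mu, Sigma) through a nonlinearity f by moment matching on sigma points:

    import numpy as np

    def unscented_transform(mu, Sigma, f, alpha=1e-3, beta=2.0, kappa=0.0):
        # f maps R^n -> R^m and must return a 1-D array.
        n = len(mu)
        lam = alpha**2 * (n + kappa) - n
        L = np.linalg.cholesky((n + lam) * Sigma)
        pts = np.vstack([mu, mu + L.T, mu - L.T])   # 2n+1 sigma points
        wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
        wc = wm.copy()
        wm[0] = lam / (n + lam)
        wc[0] = wm[0] + (1 - alpha**2 + beta)
        Y = np.array([f(p) for p in pts])           # propagated points
        mean = wm @ Y
        diff = Y - mean
        cov = (wc[:, None] * diff).T @ diff
        return mean, cov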
|
About: Toeblitz is a MATLAB/Octave package for operations on positive definite Toeplitz matrices. It can solve Toeplitz systems Tx = b in O(n log n) time and O(n) memory, compute matrix inverses T^(-1) (with the log determinant for free) in O(n^2) time and memory, compute log determinants (without inverses) in O(n^2) time and O(n) memory, and compute traces of products A*T for any matrix A in minimal O(n^2) time and memory. Changes: Added a write-up in written/toeblitz.pdf describing the package.
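The O(n log n) behaviour comes from the classical circulant-embedding trick: a Toeplitz matrix-vector product reduces to a circular convolution computable with FFTs, and an iterative solver built on that product handles Tx = b in O(n) memory. A hedged sketch in plain NumPy (not Toeblitz's MATLAB interface):

    import numpy as np

    def toeplitz_matvec(c, r, x):
        # T has first column c and first row r (with c[0] == r[0]).
        # Embed T in a 2n x 2n circulant matrix, whose action is a
        # circular convolution, and apply it with FFTs in O(n log n).
        n = len(x)
        circ = np.concatenate([c, [0.0], r[:0:-1]])
        xp = np.concatenate([x, np.zeros(n)])
        y = np.fft.ifft(np.fft.fft(circ) * np.fft.fft(xp))
        return y[:n].real

Wrapping this product in a scipy.sparse.linalg.LinearOperator and passing it to scipy.sparse.linalg.cg then solves a positive definite Toeplitz system without ever forming T explicitly.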
|
About: Gaussian process RTS smoothing (forward-backward smoothing) based on moment matching. Changes: Initial Announcement on mloss.org.
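For reference, one backward step of Rauch-Tung-Striebel smoothing in moment form; the cross-covariance C between consecutive states is exactly what moment matching through the GP dynamics supplies (a hedged sketch in plain NumPy with hypothetical argument names, not this package's interface):

    import numpy as np

    def rts_backward_step(mf, Pf, mp, Pp, C, ms_next, Ps_next):
        # mf, Pf : filtered mean/covariance at time t
        # mp, Pp : predicted mean/covariance at time t+1
        # C      : cross-covariance Cov[x_t, x_{t+1}] from moment matching
        # ms_next, Ps_next : smoothed mean/covariance at time t+1
        J = C @ np.linalg.inv(Pp)                  # smoother gain
        ms = mf + J @ (ms_next - mp)
        Ps = Pf + J @ (Ps_next - Pp) @ J.T
        return ms, Ps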
|
About: This local and parallel computation toolbox (GPLP) is the Octave and Matlab implementation of several localized Gaussian process regression methods: the domain decomposition method (Park et al., 2011, DDM), partially independent conditional (Snelson and Ghahramani, 2007, PIC), localized probabilistic regression (Urtasun and Darrell, 2008, LPR), and bagging for Gaussian process regression (Chen and Ren, 2009, BGP). Most of the localized regression methods can be applied to general machine learning problems, although DDM is applicable only to spatial datasets. In addition, GPLP provides two parallel-computation versions of the domain decomposition method. Ease of parallelization is one of the advantages of localized regression, and the two parallel implementations offer guidance on how to realize this advantage in software. Changes: Initial Announcement on mloss.org.
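As a hedged illustration of the localized idea these methods share (not GPLP's Octave/Matlab API), the sketch below predicts at each test point using only its k nearest training points, with gp_posterior standing for any exact GP regression routine such as the one sketched earlier; because each neighbourhood is independent, the loop is also trivially parallelizable:

    import numpy as np

    def local_gp_predict(X, y, Xs, gp_posterior, k=100):
        # Localized GP regression: fit and predict from a small local
        # neighbourhood per test point instead of the full dataset.
        preds = np.empty(len(Xs))
        for i, xs in enumerate(Xs):
            idx = np.argsort(((X - xs) ** 2).sum(axis=1))[:k]
            mu, _ = gp_posterior(X[idx], y[idx], xs[None, :])
            preds[i] = mu[0]
        return preds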
|