Projects tagged with nonparametric bayes.


revrand 0.3

by dsteinberg - April 29, 2016, 07:31:27 CET [ Project Homepage BibTeX Download ] 1736 views, 383 downloads, 3 subscriptions

About: A library of scalable Bayesian generalised linear models with fancy features

Changes:
  • Simplified all algorithm interfaces by using Parameter (bounded) types
  • Refactored the library's modules to make them more user friendly
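
To give a concrete picture of the model class revrand targets, here is a minimal sketch of a Bayesian generalised linear model: Bayesian logistic regression fitted with a Laplace approximation. It illustrates the general technique only, not revrand's algorithms or API, and every name in it is hypothetical.

```python
import numpy as np

def bayes_logistic_laplace(X, y, alpha=1.0, iters=25):
    """Laplace approximation for Bayesian logistic regression with a
    N(0, alpha^-1 I) prior on the weights: find the MAP by Newton's method,
    then use the negative inverse Hessian as the posterior covariance."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))                     # Bernoulli means
        grad = X.T @ (y - p) - alpha * w                     # log-posterior gradient
        H = -(X.T * (p * (1 - p))) @ X - alpha * np.eye(d)   # log-posterior Hessian
        w = w - np.linalg.solve(H, grad)                     # Newton ascent step
    p = 1.0 / (1.0 + np.exp(-X @ w))
    H = -(X.T * (p * (1 - p))) @ X - alpha * np.eye(d)
    return w, np.linalg.inv(-H)                              # MAP weights, Laplace covariance

# Toy usage: two informative features plus a bias column.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(200, 2)), np.ones((200, 1))])
y = (X @ np.array([1.5, -2.0, 0.3]) + rng.normal(size=200) > 0).astype(float)
w_map, w_cov = bayes_logistic_laplace(X, y)
print(w_map, np.sqrt(np.diag(w_cov)))                        # point estimates and posterior s.d.
```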

hca 0.63

by wbuntine - April 26, 2016, 15:35:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 15901 views, 2565 downloads, 4 subscriptions

About: Multi-core non-parametric and bursty topic models (HDP-LDA, DCMLDA, and other variants of LDA) implemented in C using efficient Gibbs sampling, with hyperparameter sampling and other flexible controls.

Changes:

Corrected the new normalised Gamma model for topics so it works with multicore. Improvements to documentation. Added an asymptotic version of the generalised Stirling numbers so it no longer fails when they go out of bounds on larger data.
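
For readers who want to see the core machinery these models build on, below is a minimal pure-Python sketch of collapsed Gibbs sampling for plain LDA. It only illustrates the basic sampler; hca's C implementation, the HDP and burstiness extensions, multicore support and hyperparameter sampling are beyond this sketch, and every name here is hypothetical.

```python
import numpy as np

def lda_gibbs(docs, V, K, alpha=0.1, beta=0.01, sweeps=200, seed=0):
    """Collapsed Gibbs sampler for plain LDA: docs is a list of word-id
    lists over a vocabulary of size V, with K topics. Returns the final
    document-topic and topic-word count matrices."""
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), K))                  # document-topic counts
    nkw = np.zeros((K, V))                          # topic-word counts
    nk = np.zeros(K)                                # topic totals
    z = [rng.integers(K, size=len(doc)) for doc in docs]
    for d, doc in enumerate(docs):                  # initialise counts from random z
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(sweeps):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                         # remove the current assignment
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # conditional p(z = k | everything else), up to a constant
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k                         # add the new assignment back
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw

# Toy corpus: word ids from a vocabulary of size 6, two latent topics.
docs = [[0, 1, 0, 2], [3, 4, 5, 4], [0, 2, 1, 1], [5, 3, 4, 5]]
ndk, nkw = lda_gibbs(docs, V=6, K=2)
print(ndk)
```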


JMLR GPstuff 4.6

by avehtari - July 15, 2015, 15:08:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 31996 views, 7699 downloads, 2 subscriptions

Rating: 5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2015-07-09 Version 4.6

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Use Pareto smoothed importance sampling (Vehtari & Gelman, 2015; a rough sketch of the smoothing step follows this change list) for:

    • importance sampling leave-one-out cross-validation (gpmc_loopred.m)
    • importance sampling integration over hyperparameters (gp_ia.m)
    • the importance sampling part of logistic Gaussian process density estimation (lgpdens.m)

    References:

    • Aki Vehtari and Andrew Gelman (2015). Pareto smoothed importance sampling. arXiv preprint arXiv:1507.02646.
    • Aki Vehtari, Andrew Gelman and Jonah Gabry (2015). Efficient implementation of leave-one-out cross-validation and WAIC for evaluating fitted Bayesian models.

  • New covariance functions:

    • gpcf_additive creates a mixture over products of kernels for each dimension. Reference: Duvenaud, D. K., Nickisch, H., & Rasmussen, C. E. (2011). Additive Gaussian processes. In Advances in Neural Information Processing Systems, pp. 226-234.
    • gpcf_linearLogistic corresponds to a logistic mean function
    • gpcf_linearMichelismenten corresponds to a Michaelis-Menten mean function

Improvements

  • Faster EP moment calculation for lik_logit

Several minor bugfixes
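
As a rough illustration of the Pareto smoothing step referenced in the change list above, the sketch below fits a generalised Pareto distribution to the largest importance ratios and replaces them with quantiles of the fit. It is a simplified sketch of the idea in Vehtari & Gelman (2015), not GPstuff's MATLAB implementation; the full procedure adds further details (weight truncation and a specific GPD estimator), and the helper name, defaults, and use of SciPy here are assumptions.

```python
import numpy as np
from scipy.stats import genpareto

def psis_smooth(log_ratios, tail_frac=0.2):
    """Pareto-smooth one observation's importance log-ratios across S
    posterior draws: fit a generalised Pareto distribution (GPD) to the
    largest weights and replace them with quantiles of the fit.
    Returns normalised weights and the fitted shape k-hat."""
    lw = log_ratios - np.max(log_ratios)            # stabilise before exponentiating
    w = np.exp(lw)
    S = len(w)
    M = max(int(np.ceil(tail_frac * S)), 5)         # number of tail weights to smooth
    order = np.argsort(w)
    cutoff = w[order[-M - 1]]                       # largest weight left untouched
    khat, _, sigma = genpareto.fit(w[order[-M:]] - cutoff, floc=0.0)
    probs = (np.arange(1, M + 1) - 0.5) / M         # plotting positions for the tail
    w[order[-M:]] = cutoff + genpareto.ppf(probs, khat, loc=0.0, scale=sigma)
    return w / w.sum(), khat

# Toy usage: heavy-tailed draws standing in for raw LOO importance log-ratios.
rng = np.random.default_rng(1)
w, khat = psis_smooth(rng.standard_t(df=3, size=4000))
print(khat)                                         # a large k-hat flags unreliable weights
```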


linearizedGP 1.0

by dsteinberg - November 28, 2014, 07:02:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1960 views, 441 downloads, 1 subscription

About: Gaussian processes with general nonlinear likelihoods using the unscented transform or Taylor series linearisation.

Changes:

Initial Announcement on mloss.org.
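
To illustrate the linearisation idea in the description above: the unscented transform propagates a Gaussian through a nonlinear forward model by evaluating it at 2d+1 deterministically chosen sigma points and recombining the results with fixed weights. This is a generic sketch of the transform, not linearizedGP's API, and the names and defaults are hypothetical.

```python
import numpy as np

def unscented_transform(mu, cov, f, kappa=1.0):
    """Approximate the mean and covariance of f(x) for x ~ N(mu, cov),
    where f maps R^d to R^m, using the (unscaled) unscented transform."""
    d = len(mu)
    L = np.linalg.cholesky((d + kappa) * cov)       # controls the sigma-point spread
    sigma = np.vstack([mu, mu + L.T, mu - L.T])     # (2d+1, d) sigma points
    w = np.full(2 * d + 1, 1.0 / (2 * (d + kappa)))
    w[0] = kappa / (d + kappa)                      # weights sum to one
    Y = np.array([f(s) for s in sigma])             # push each point through f
    mean = w @ Y
    diff = Y - mean
    return mean, (w[:, None] * diff).T @ diff       # weighted output mean and covariance

# Toy usage: a 2-D Gaussian pushed through a coordinate-wise sigmoid.
mu = np.array([0.5, -1.0])
cov = np.array([[0.2, 0.05], [0.05, 0.1]])
m, S = unscented_transform(mu, cov, lambda x: 1.0 / (1.0 + np.exp(-x)))
print(m, S)
```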


Nonparametric Sparse Factor Analysis 1

by davidknowles - July 26, 2013, 01:02:02 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2906 views, 667 downloads, 1 subscription

About: This is the core MCMC sampler for the nonparametric sparse factor analysis model presented in David A. Knowles and Zoubin Ghahramani (2011). Nonparametric Bayesian sparse factor models with application to gene expression modelling. Annals of Applied Statistics.

Changes:

Initial Announcement on mloss.org.
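
To make the model family concrete, the sketch below draws from the kind of generative process a nonparametric sparse factor model encodes: an Indian Buffet Process binary matrix selects which latent factors each row loads on, and the observations are a linear-Gaussian combination of the active factors. It is an illustration of the model class under simple assumptions, not this package's MCMC sampler, and the parameter names are hypothetical.

```python
import numpy as np

def sample_ibp(N, alpha, rng):
    """Draw a binary feature matrix Z (N objects x K+ features) from the
    Indian Buffet Process with concentration alpha."""
    Z = np.zeros((N, 0), dtype=int)
    for n in range(N):
        if Z.shape[1] > 0:
            # existing features: keep each with probability proportional to its popularity
            old = (rng.random(Z.shape[1]) < Z[:n].sum(axis=0) / (n + 1)).astype(int)
        else:
            old = np.zeros(0, dtype=int)
        k_new = rng.poisson(alpha / (n + 1))                 # brand-new features
        Z = np.hstack([Z, np.zeros((N, k_new), dtype=int)])
        Z[n, :len(old)] = old
        Z[n, len(old):] = 1
    return Z

# Linear-Gaussian sparse factor model: Y = (Z * G) F + noise, where the
# IBP-distributed Z says which factors each row uses.
rng = np.random.default_rng(2)
N, D, alpha = 10, 8, 2.0                                     # rows, observed dims, IBP concentration
Z = sample_ibp(N, alpha, rng)                                # sparsity pattern
K = Z.shape[1]                                               # number of active factors
G = rng.normal(size=(N, K))                                  # loading values
F = rng.normal(size=(K, D))                                  # latent factors
Y = (Z * G) @ F + 0.1 * rng.normal(size=(N, D))              # observed data
print(K, Z.sum(axis=0))
```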


The Infinite Hidden Markov Model 0.5

by jvangael - July 21, 2010, 23:41:24 CET [ BibTeX BibTeX for corresponding Paper Download ] 18975 views, 3324 downloads, 1 subscription

About: An implementation of the infinite hidden Markov model.

Changes:

Since 0.4: Removed the dependency on Lightspeed (now using the Statistics Toolbox). Updated for newer MATLAB versions.
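
As a brief orientation to the prior behind the infinite HMM, the sketch below uses a truncated stick-breaking construction of the hierarchical Dirichlet process: a global weight vector is shared across states, and each state's transition row is a Dirichlet draw centred on it. This is a rough, truncated illustration of the prior only, not this package's inference code, and the names and truncation level are assumptions.

```python
import numpy as np

def ihmm_prior_sample(K, gamma=4.0, alpha=2.0, T=50, seed=3):
    """Sample from a truncated stick-breaking sketch of the iHMM prior:
    global GEM(gamma) weights beta over K states, per-state transition rows
    drawn as Dirichlet(alpha * beta), then a state sequence of length T."""
    rng = np.random.default_rng(seed)
    v = rng.beta(1.0, gamma, size=K)                             # stick-breaking proportions
    beta = v * np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]]) # GEM weights, truncated at K
    beta = beta / beta.sum()                                     # renormalise the truncation
    Pi = rng.dirichlet(alpha * beta + 1e-6, size=K)              # rows centred on the shared beta
    states = np.zeros(T, dtype=int)
    states[0] = rng.choice(K, p=beta)
    for t in range(1, T):                                        # roll out a state sequence
        states[t] = rng.choice(K, p=Pi[states[t - 1]])
    return beta, Pi, states

beta, Pi, states = ihmm_prior_sample(K=10)
print(np.round(beta, 3))
print(np.bincount(states, minlength=10))                         # states visited by the sequence
```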


About: Matlab code for performing variational inference in the Indian Buffet Process with a linear-Gaussian likelihood model.

Changes:

Initial Announcement on mloss.org.