Project details for revrand

revrand 1.0.0

by dsteinberg - January 29, 2017, 04:33:54 CET


Description:

This library implements various Bayesian linear models (Bayesian linear regression) and generalized linear models. Key features of this library are:

  • A fancy basis function/feature composition framework for combining basis functions such as radial, sigmoidal and polynomial bases (see the usage sketch after this list).

  • Basis functions that can be used to approximate Gaussian processes with shift-invariant covariance functions (e.g. the squared exponential) when used with linear models [1], [2], [3]; a standalone sketch of this idea follows the references below.

  • Non-Gaussian likelihoods with Bayesian generalized linear models (GLMs). We infer all of the parameters in the GLMs using stochastic variational inference [4], and we approximate the posterior over the weights with a mixture of Gaussians, as in [5].

  • Large scale learning using stochastic gradient descent (Adam, AdaDelta and more).

  • Scikit Learn compatibility, i.e. usable with pipelines.

  • A host of decorators for scipy.optimize.minimize and stochastic gradients that enhance the functionality of these optimisers.
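To illustrate the basis composition and Scikit Learn-style interface described above, here is a minimal usage sketch. The import paths, class names (StandardLinearModel, LinearBasis, RandomRBF) and their arguments are assumptions inferred from this feature list rather than a verified transcript of the revrand API; consult the project documentation for the actual interface.

    # Hypothetical sketch: names and signatures below are assumptions,
    # not verified against the revrand API.
    import numpy as np
    from revrand import StandardLinearModel                      # assumed path
    from revrand.basis_functions import LinearBasis, RandomRBF   # assumed names

    X = np.random.randn(200, 3)
    y = np.random.randn(200)

    # Compose bases: a linear term plus a random-RBF GP approximation
    basis = LinearBasis(onescol=True) + RandomRBF(nbases=100, Xdim=X.shape[1])

    model = StandardLinearModel(basis)   # Scikit Learn-style estimator
    model.fit(X, y)                      # so it drops into sklearn pipelines
    y_pred = model.predict(X)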

[1] Yang, Z., Smola, A. J., Song, L., & Wilson, A. G. "A la Carte -- Learning Fast Kernels". Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, pp. 1098-1106, 2015.

[2] Le, Q., Sarlos, T., & Smola, A. "Fastfood -- Approximating Kernel Expansions in Loglinear Time". Proceedings of the International Conference on Machine Learning, 2013.

[3] Rahimi, A., & Recht, B. "Random Features for Large-Scale Kernel Machines". Advances in Neural Information Processing Systems, 2007.

[4] Kingma, D. P., & Welling, M. "Auto-Encoding Variational Bayes". Proceedings of the 2nd International Conference on Learning Representations (ICLR), 2014.

[5] Gershman, S., Hoffman, M., & Blei, D. "Nonparametric Variational Inference". Proceedings of the International Conference on Machine Learning, 2012.
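To make the random-features idea of [1]-[3] concrete, the following is a small self-contained NumPy sketch (illustrative only, not revrand code); the feature count and lengthscale are arbitrary choices:

    import numpy as np

    def random_rbf_features(X, n_features=500, lengthscale=1.0, seed=0):
        """Random Fourier features (Rahimi & Recht [3]): returns Phi such that
        Phi @ Phi.T approximates the squared-exponential kernel matrix."""
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        # Frequencies drawn from the kernel's spectral density, N(0, I / l^2)
        W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
        b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
        return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

    X = np.random.randn(100, 2)
    Phi = random_rbf_features(X)
    K_approx = Phi @ Phi.T   # approaches exp(-||x - x'||^2 / (2 l^2)) as
                             # n_features grows; a Bayesian linear model on
                             # Phi then approximates GP regression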

Changes to previous version:
  • 1.0 release!
  • There is now a random search phase before optimisation of all hyperparameters in the regression algorithms. This improves the performance of revrand, since the better initialisation makes local optima easier to avoid.
  • Regression regularizers (weight variances) are now associated with each basis object, which approximates GP kernel addition more closely.
  • Random state can be set for all random objects
  • Numerous small improvements to make revrand production ready
  • Final report
  • Documentation improvements
Supported Operating Systems: Platform Independent
Data Formats: Numpy
Tags: Stochastic Gradient Descent, Large Scale Learning, Nonparametric Bayes, Nonlinear Regression, Gaussian Processes, Generalized Linear Models, Spark, Fast Food, Random Features

Other available revisions

Version Changelog Date
0.9.0
  • There is now a random search phase before optimisation of all hyperparameters in the regression algorithms. This improves the performance of revrand, since the better initialisation makes local optima easier to avoid.

  • Documentation improvements

  • Simplification of GLM codebase

October 31, 2016, 06:02:46
0.7.0
  • Ability to set the random state in all random basis functions, optimisers and the generalised linear model
  • Numerous numerical bug fixes
  • Small performance optimisations
October 14, 2016, 08:31:02
0.6.0
  • The GLM now uses auto-encoding variational Bayes for inference, as opposed to nonparametric variational inference. This substantially improves performance and simplifies the codebase (see the sketch after this entry).
  • Many bugfixes.
August 8, 2016, 08:39:08
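As background to the change above: the core of auto-encoding variational Bayes [4] is the reparameterisation trick. A minimal NumPy sketch of that trick (purely illustrative, not revrand's implementation, with a stand-in objective):

    import numpy as np

    rng = np.random.default_rng(0)
    mu, log_sigma = np.zeros(3), np.zeros(3)     # variational parameters

    # Reparameterise: w = mu + sigma * eps with eps ~ N(0, I), so a Monte
    # Carlo estimate of E_q[f(w)] is differentiable in mu and log_sigma.
    eps = rng.standard_normal((100, 3))
    w = mu + np.exp(log_sigma) * eps

    f = lambda w: -0.5 * np.sum(w ** 2, axis=1)  # stand-in log-likelihood term
    elbo_term = f(w).mean()                      # stochastic ELBO estimate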
0.5.0
  • Main interfaces to algorithms now follow the Scikit Learn standard.
  • Documentation improved.
  • Codebase dramatically simplified.
  • Per-datum arguments allowed in GLM.
July 26, 2016, 12:19:24
0.4.1
  • Allow for non-learnable likelihood arguments (per datum) in the GLM
  • Hotfix for GLM prediction sampling functions
June 24, 2016, 05:58:06
0.4
  • Implemented Gaussian spectral mixture basis functions (from [1])
  • All relevant basis functions have ARD (automatic relevance determination)
  • Better matrix solves for positive definite matrices in the slm module (try Cholesky; if that fails or is numerically unstable, fall back to truncated SVD; see the sketch after this entry)
  • Compatibility with Scikit Learn pipelines, and an optional Scikit Learn-style interface
  • More demos in notebooks
June 14, 2016, 06:39:25
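The solve strategy in the 0.4 entry above (try Cholesky, fall back to truncated SVD) can be sketched as follows. This illustrates the described approach under an assumed truncation threshold, omits the numerical-stability check for brevity, and is not revrand's actual code:

    import numpy as np
    from scipy.linalg import cho_factor, cho_solve, LinAlgError

    def solve_psd(A, b, rcond=1e-10):
        """Solve A x = b for a symmetric positive (semi-)definite A."""
        try:
            # Fast path: Cholesky factorisation for well-conditioned PD matrices
            c, low = cho_factor(A)
            return cho_solve((c, low), b)
        except LinAlgError:
            # Fallback: truncated-SVD pseudo-inverse when A is singular
            # or not numerically positive definite
            U, s, Vt = np.linalg.svd(A)
            keep = s > rcond * s.max()
            return Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])

    x = solve_psd(np.array([[4., 2.], [2., 3.]]), np.array([1., 2.]))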
0.3.2
  • Compatibility with Scikit Learn pipelines, and an optional Scikit Learn-style interface
  • Changed 'regression' module to 'slm' module for consistency with 'glm' module
May 31, 2016, 09:57:37
0.3
  • Simplification of all of the algorithm interfaces by using Parameter (bounded) types
  • Re-factoring of the modules in the library to make it more user friendly
April 29, 2016, 07:31:27
0.2
  • Partial application of basis functions
  • Bayesian GLM fully implemented with SGD
  • Library refactor and removal of functionality that is also in sklearn
  • Many bug fixes and much more thorough unit testing
April 9, 2016, 08:18:26
0.1

Initial Announcement on mloss.org.

December 16, 2015, 02:58:08
