Projects supporting the Octave data format.


JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10823 views, 2923 downloads, 2 subscriptions

Rating: 5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.
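As a quick orientation, a minimal regression sketch in the style of the GPstuff demos (the function names lik_gaussian, gpcf_sexp, gp_set, gp_optim and gp_pred follow the GPstuff documentation; exact option names may differ between versions):

    % Minimal GP regression sketch in the style of the GPstuff demos.
    % Assumes GPstuff is installed and on the Octave/Matlab path.
    x  = linspace(-2, 2, 50)';              % training inputs
    y  = sin(3*x) + 0.1*randn(size(x));     % noisy training targets
    xt = linspace(-2.5, 2.5, 200)';         % test inputs

    lik  = lik_gaussian('sigma2', 0.1^2);                 % Gaussian observation model
    gpcf = gpcf_sexp('lengthScale', 1, 'magnSigma2', 1);  % squared exponential covariance
    gp   = gp_set('lik', lik, 'cf', gpcf);                % assemble the GP structure

    opt = optimset('TolFun', 1e-3, 'TolX', 1e-3);
    gp  = gp_optim(gp, x, y, 'opt', opt);                 % optimize hyperparameters
    [Eft, Varft] = gp_pred(gp, x, y, xt);                 % predictive mean and variance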

Changes:

2014-07-22 Version 4.5

New features

  • Input dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • Option 'autoscale' in gp_rnd.m for drawing split-normal approximated samples from the posterior predictive distribution of the latent variable.

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.

Improvements

  • New unit test environment using the Matlab built-in test framework (the old xUnit package is also still supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating the new features:
    • demo_epinf, demonstrating the input-dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes


JMLR Information Theoretical Estimators 0.60

by szzoli - June 3, 2014, 00:17:33 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 41368 views, 8940 downloads, 2 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE additionally supports (i) combinations of the estimation techniques, (ii) the easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.
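For illustration, a Shannon entropy estimate written in the base-estimator call pattern of the ITE documentation; the estimator name HShannon_kNN_k, the initialization/estimation naming convention and the column-wise sample layout are assumptions to verify against your ITE version:

    % k-nearest-neighbour Shannon entropy estimation, ITE-style call pattern.
    Y    = randn(3, 2000);                      % 3-dimensional sample, one point per column
    mult = 1;                                   % include multiplicative constants
    co   = HShannon_kNN_k_initialization(mult); % initialize the estimator (cost object)
    H    = HShannon_kNN_k_estimation(Y, co);    % estimated entropy (in nats)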

Changes:
  • Quick test on the Tsallis divergence: introduced.

  • Pearson chi-square divergence estimation in the exponential family (MLE + analytical formula): added.


JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.4

by hn - November 11, 2013, 14:46:52 CET [ Project Homepage BibTeX Download ] 17405 views, 4215 downloads, 3 subscriptions

Rating: 5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.
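A minimal regression sketch following the usage shown in the GPML 3.x documentation (hyperparameters are stored as logarithms; gpml_randn is the toolbox's seeded random number helper; details may differ in other versions):

    % Minimal GPML regression sketch; assumes the GPML toolbox is on the path.
    meanfunc = @meanConst;  hyp.mean = 0;
    covfunc  = @covSEiso;   hyp.cov  = [0; 0];    % log(lengthscale), log(signal std)
    likfunc  = @likGauss;   hyp.lik  = log(0.1);  % log(noise std)

    x  = gpml_randn(0.3, 20, 1);                  % training inputs
    y  = sin(3*x) + 0.1*gpml_randn(0.2, 20, 1);   % training targets
    xs = linspace(-2, 2, 101)';                   % test inputs

    hyp = minimize(hyp, @gp, -100, @infExact, meanfunc, covfunc, likfunc, x, y);
    [mu, s2] = gp(hyp, @infExact, meanfunc, covfunc, likfunc, x, y, xs);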

Changes:
  • derivatives w.r.t. inducing points xu in infFITC, infFITC_Laplace, infFITC_EP so that one can treat the inducing points either as fixed given quantities or as additional hyperparameters
  • new GLM likelihood likExp for inter-arrival time modeling
  • new GLM likelihood likWeibull for extremal value regression
  • new GLM likelihood likGumbel for extremal value regression
  • new mean function meanPoly depending polynomially on the data
  • infExact can deal safely with the zero noise variance limit
  • support of GP warping through the new likelihood function likGaussWarp

About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference.
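For orientation, a generic sketch (plain Octave, not the glm-ie interface; none of the names below come from the package) of the kind of sparse linear model MAP estimate such a toolbox computes, here a Lasso-type objective solved by iterative soft-thresholding:

    % MAP estimation for a sparse linear model:
    %   min_u  ||X*u - y||^2 / (2*s2) + tau*||u||_1
    % solved by iterative soft-thresholding (ISTA); illustrative only.
    n = 100; d = 200; k = 10;
    X = randn(n, d);
    idx = randperm(d);  u0 = zeros(d, 1);  u0(idx(1:k)) = randn(k, 1);  % sparse ground truth
    y = X*u0 + 0.01*randn(n, 1);

    tau = 0.1;  s2 = 1;
    L = norm(X)^2 / s2;                      % Lipschitz constant of the quadratic part
    u = zeros(d, 1);
    for it = 1:500
      g = X'*(X*u - y) / s2;                 % gradient of the quadratic term
      v = u - g/L;
      u = sign(v) .* max(abs(v) - tau/L, 0); % soft-thresholding (prox of the L1 penalty)
    end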

Changes:

added factorial mean field inference as a third algorithm complementing expectation propagation and variational Bayes

generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used


About: Toeblitz is a MATLAB/Octave package for operations on positive definite Toeplitz matrices. It can solve Toeplitz systems Tx = b in O(n*log(n)) time and O(n) memory, compute matrix inverses T^(-1) (with free log determinant) in O(n^2) time and memory, compute log determinants (without inverses) in O(n^2) time and O(n) memory, and compute traces of products A*T for any matrix A, in minimal O(n^2) time and memory.
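The O(n*log(n)) claim rests on the fact that a Toeplitz matrix-vector product can be computed through a circulant embedding and the FFT; a generic sketch of that idea (plain Octave, not the Toeblitz interface):

    % FFT-based matvec with a symmetric Toeplitz matrix T (first column t):
    % embed T in a 2n x 2n circulant matrix and multiply in O(n*log(n)).
    n = 512;
    t = 0.95 .^ (0:n-1)';           % first column of a positive definite Toeplitz matrix
    x = randn(n, 1);

    c  = [t; 0; t(end:-1:2)];       % first column of the circulant embedding
    xp = [x; zeros(n, 1)];          % zero-padded input vector
    yp = ifft(fft(c) .* fft(xp));   % circulant matvec via the FFT
    y  = real(yp(1:n));             % first n entries equal T*x

    norm(y - toeplitz(t)*x)         % check against the dense product (should be ~0)

Fast solvers for Tx = b can be built on top of such matvecs, for example with conjugate gradients.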

Changes:

Added the tar archive directly instead of via a link


NaN toolbox 2.5.2

by schloegl - February 10, 2012, 11:45:52 CET [ Project Homepage BibTeX Download ] 27754 views, 5648 downloads, 1 subscription

About: NaN-toolbox is a statistics and machine learning toolbox for handling data with and without missing values.
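The basic idea, assuming the toolbox's replacements for the standard descriptive statistics are on the path as described in its documentation:

    % With the NaN toolbox, missing values (NaNs) are skipped rather than
    % propagated by the descriptive statistics.
    x = [1; 2; NaN; 4];
    mean(x)       % 2.3333 with the NaN toolbox; NaN with the plain built-in
    std(x)        % computed from the three non-missing values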

Changes:

Changes in v.2.5.2:
  • faster version of quantile if multiple quantiles are requested
  • removes the dependency on ZLIB and thus fixes "pkg install nan" for Octave on Windows
  • a number of minor improvements

For details see the CHANGELOG at http://pub.ist.ac.at/~schloegl/matlab/NaN/CHANGELOG


mldata.org svn-r1070-Apr-2011

by sonne - April 8, 2011, 10:15:49 CET [ Project Homepage BibTeX Download ] 3591 views, 657 downloads, 1 subscription

About: The source code of the mldata.org site - a community portal for machine learning data sets.

Changes:

Initial Announcement on mloss.org.


mldata-utils 0.5.0

by sonne - April 8, 2011, 10:02:44 CET [ Project Homepage BibTeX Download ] 18029 views, 3737 downloads, 1 subscription

About: Tools to convert datasets between various formats, performance measures, and API functions for communicating with mldata.org.

Changes:
  • Changed the task file format so that data splits can have a variable number of items and can be put into up to 256 categories of training/validation/test/not used/...
  • Various bugfixes.

Hidden Markov Support Vector Machines 0.2

by pramod - April 16, 2010, 17:27:41 CET [ BibTeX Download ] 4540 views, 1165 downloads, 1 subscription

About: This software is an implementation of Hidden Markov Support Vector Machines (HMSVMs).

Changes:

Initial Announcement on mloss.org.


About: This toolbox provides functions for maximizing and minimizing submodular set functions, with applications to Bayesian experimental design, inference in Markov Random Fields, clustering and others.
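A generic lazy-greedy sketch (plain Octave, not the toolbox's sfo_* interface; all names below are illustrative) for cardinality-constrained maximization of a monotone submodular function, here an information-gain objective of the kind used in Bayesian experimental design:

    % Lazy greedy selection of k elements maximizing F(A) = log det(I + 10*K(A,A)).
    n = 30; k = 5;
    X  = randn(n, 2);
    D2 = bsxfun(@plus, sum(X.^2, 2), sum(X.^2, 2)') - 2*(X*X');
    K  = exp(-D2) + 1e-6*eye(n);                        % kernel (covariance) matrix
    F  = @(A) log(det(eye(numel(A)) + 10*K(A, A)));     % monotone submodular set function

    A = [];
    delta = inf(n, 1);                                   % stale upper bounds on marginal gains
    for i = 1:k
      updated = false(n, 1);
      while true
        [~, e] = max(delta);                             % element with the largest stale bound
        if updated(e), A = [A e]; delta(e) = -inf; break; end
        delta(e) = F([A e]) - F(A);                      % refresh its marginal gain
        updated(e) = true;
      end
    end
    A                                                    % selected index set

Laziness is valid because submodularity guarantees marginal gains only shrink as the selected set grows, so stale gains remain upper bounds.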

Changes:
  • Modified specification of optional parameters (using sfo_opt)
  • Added sfo_ls_lazy for maximizing nonnegative submodular functions
  • Added sfo_fn_infogain, sfo_fn_lincomb, sfo_fn_invert, ...
  • Added additional documentation and more examples
  • Now Octave ready

FWTN 1.0

by hn - March 25, 2010, 16:58:24 CET [ Project Homepage BibTeX Download ] 3675 views, 817 downloads, 1 subscription

About: Orthonormal wavelet transform for D-dimensional tensors in L levels. Generic quadrature mirror filters and tensor sizes are supported. Runtime is O(n); plain C implementation, MEX wrapper and demo provided.
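For intuition, one level of the simplest orthonormal (Haar) transform in one dimension; this is a generic sketch, not the FWTN interface, showing the kind of norm-preserving quadrature-mirror-filter step the transform applies along each tensor dimension:

    % One Haar analysis/synthesis step: norm-preserving split into
    % approximation (low-pass) and detail (high-pass) coefficients.
    x = randn(1, 8);
    a = (x(1:2:end) + x(2:2:end)) / sqrt(2);   % approximation coefficients
    d = (x(1:2:end) - x(2:2:end)) / sqrt(2);   % detail coefficients
    norm([a d]) - norm(x)                      % orthonormality: ~0

    xr = zeros(size(x));                       % inverse (synthesis) step
    xr(1:2:end) = (a + d) / sqrt(2);
    xr(2:2:end) = (a - d) / sqrt(2);
    norm(xr - x)                               % perfect reconstruction: ~0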

Changes:

Initial Announcement on mloss.org.


JMLR Error Correcting Output Codes Library 0.1

by sescalera - March 5, 2010, 16:49:12 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 8210 views, 1041 downloads, 1 subscription

About: The open source Error-Correcting Output Codes (ECOC) library contains state-of-the-art coding and decoding designs, as well as the option to include your own coding, decoding, and base classifier.
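The core idea in a generic sketch (plain Octave, not this library's interface): each class is represented by a row of a coding matrix, the binary classifiers each produce one bit, and decoding picks the class whose codeword is closest, e.g. in Hamming distance:

    % One-vs-all ECOC coding for 4 classes with Hamming decoding of a
    % single sample's binary predictions; illustrative only.
    M = 2*eye(4) - 1;                          % coding matrix, rows are codewords in {-1,+1}
    pred = [1 -1 -1 -1];                       % outputs of the 4 binary classifiers
    dist = sum(M ~= repmat(pred, 4, 1), 2);    % Hamming distance to each codeword
    [~, yhat] = min(dist);                     % decoded class label (here: 1)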

Changes:

Initial Announcement on mloss.org.