33 projects found that use Octave as the programming language.
Showing Items 1-20 of 33 on page 1 of 2.

JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.5

by hn - December 8, 2014, 13:54:38 CET [ Project Homepage BibTeX Download ] 20452 views, 4787 downloads, 3 subscriptions

Rating: 5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.
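For orientation, here is a minimal sketch of a typical GPML 3.x regression call in Octave/Matlab; the toy data and hyperparameter values are illustrative only, not part of the toolbox.

    % Minimal GP regression with GPML 3.x (toy data, illustrative hyperparameters)
    x = linspace(-3, 3, 20)'; y = sin(x) + 0.1 * randn(20, 1);   % toy training set
    xs = linspace(-3, 3, 101)';                                  % test inputs
    meanfunc = @meanZero; covfunc = @covSEiso; likfunc = @likGauss;
    hyp = struct('mean', [], 'cov', [0; 0], 'lik', log(0.1));    % [log ell; log sf] and log noise std
    hyp = minimize(hyp, @gp, -100, @infExact, meanfunc, covfunc, likfunc, x, y);  % fit hyperparameters
    [mu, s2] = gp(hyp, @infExact, meanfunc, covfunc, likfunc, x, y, xs);          % predictive mean/variance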

Changes:
  • mechanism for specifying hyperparameter priors (together with Roman Garnett and José Vallet)
  • new inference method inf/infGrid allowing efficient inference for data defined on a Cartesian grid (together with Andrew Wilson)
  • new mean/cov functions for preference learning: meanPref/covPref
  • new mean/cov functions for non-vectorial data: meanDiscrete/covDiscrete
  • new piecewise constant nearest neighbor mean function: meanNN
  • new mean functions being predictions from GPs: meanGP and meanGPexact
  • new covariance function for standard additive noise: covEye
  • new covariance function for factor analysis: covSEfact
  • new covariance function with varying length scale: covSEvlen
  • made covScale more general by allowing scaling with a function instead of a scalar
  • bugfix in covGabor* and covSM (due to Andrew Gordon Wilson)
  • bugfix in lik/likBeta.m (suggested by Dali Wei)
  • bugfix in solve_chol.c (due to Todd Small)
  • bugfix in FITC inference mode (due to Joris Mooij) where the wrong mode for post.L was chosen when using infFITC and post.L being a diagonal matrix
  • bugfix in infVB marginal likelihood for likLogistic with nonzero mean function (reported by James Lloyd)
  • removed the combination likErf/infVB as it yields a bad posterior approximation and lacks theoretical justification
  • Matlab and Octave compilation for L-BFGS-B v2.4 and the more recent L-BFGS-B v3.0 (contributed by José Vallet)
  • smaller bugfixes in gp.m (due to Joris Mooij and Ernst Kloppenburg)
  • bugfix in lik/likBeta.m (due to Dali Wei)
  • updated use of logphi in lik/likErf
  • bugfix in util/solve_chol.c where a typing issue occurred on OS X (due to Todd Small)
  • bugfix due to Bjørn Sand Jensen noticing that cov_deriv_sq_dist.m was missing in the distribution
  • bugfix in infFITC_EP for ttau->inf (suggested by Ryan Turner)

BayesOpt, a Bayesian Optimization toolbox 0.7.2

by rmcantin - October 10, 2014, 19:12:59 CET [ Project Homepage BibTeX Download ] 9780 views, 2004 downloads, 4 subscriptions

About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). There are also interfaces for C, Matlab/Octave and Python.
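As background, the loop that Bayesian optimization toolboxes of this kind implement can be sketched in a few lines of Octave; the snippet below is an illustrative GP-surrogate/expected-improvement loop on a made-up 1-D objective, not the BayesOpt API (see the project homepage for the actual C, Matlab/Octave and Python interfaces).

    % Illustrative Bayesian optimization: GP surrogate + expected improvement (EI)
    f = @(x) (x - 0.3).^2 + 0.1 * sin(20 * x);           % hypothetical 1-D objective
    X = [0.1; 0.9]; y = f(X);                            % initial design
    cand = linspace(0, 1, 200)';                         % candidate points
    k = @(a, b) exp(-0.5 * (a - b').^2 / 0.1^2);         % SE kernel, fixed length scale
    Phi = @(z) 0.5 * erfc(-z / sqrt(2));                 % standard normal cdf
    npdf = @(z) exp(-0.5 * z.^2) / sqrt(2 * pi);         % standard normal pdf
    for it = 1:10
      K  = k(X, X) + 1e-6 * eye(numel(y));
      ks = k(cand, X);
      mu = ks * (K \ y);                                 % GP predictive mean
      s  = sqrt(max(1 - sum(ks .* (ks / K), 2), 1e-12)); % GP predictive std (unit prior variance)
      z  = (min(y) - mu) ./ s;
      ei = (min(y) - mu) .* Phi(z) + s .* npdf(z);       % expected improvement
      [~, j] = max(ei);
      X = [X; cand(j)]; y = [y; f(cand(j))];             % evaluate the chosen point and repeat
    end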

Changes:

Fixed bugs and documentation typos.


Toeblitz Toolkit for Fast Toeplitz Matrix Operations 1.03

by cunningham - August 13, 2014, 02:21:36 CET [ BibTeX Download ] 2713 views, 691 downloads, 2 subscriptions

About: Toeblitz is a MATLAB/Octave package for operations on positive definite Toeplitz matrices. It can solve Toeplitz systems Tx = b in O(n*log(n)) time and O(n) memory, compute matrix inverses T^(-1) (with free log determinant) in O(n^2) time and memory, compute log determinants (without inverses) in O(n^2) time and O(n) memory, and compute traces of products A*T for any matrix A, in minimal O(n^2) time and memory.
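The O(n*log(n)) behaviour rests on fast Toeplitz matrix-vector products; as a rough illustration of that building block (not the Toeblitz API), such a product can be computed in Octave by embedding the symmetric Toeplitz matrix, given by its first column t, into a circulant matrix and using the FFT.

    % Fast product T*x for a symmetric Toeplitz matrix with first column t
    n = 8; t = 0.5 .^ (0:n-1)';                   % toy first column (an AR(1)-style Toeplitz matrix)
    x = randn(n, 1);
    c = [t; 0; flipud(t(2:end))];                 % first column of a 2n-by-2n circulant embedding
    y = ifft(fft(c) .* fft([x; zeros(n, 1)]));    % circulant multiply via FFT, O(n log n)
    Tx = real(y(1:n));                            % first n entries recover T*x
    % sanity check: norm(Tx - toeplitz(t) * x) should be near machine precision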

Changes:

Added a write-up in written/toeblitz.pdf describing the package.


JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 16013 views, 3867 downloads, 2 subscriptions

Rating: 5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.
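As a pointer to how the toolbox is typically driven, a minimal regression sketch using GPstuff's constructor functions is given below; the toy data and parameter values are placeholders, and option names may differ slightly between GPstuff versions.

    % Minimal GPstuff regression sketch (toy data, placeholder parameters)
    x = linspace(-3, 3, 20)'; y = sin(x) + 0.1 * randn(20, 1);   % toy training set
    xt = linspace(-3, 3, 101)';                                  % test inputs
    lik  = lik_gaussian('sigma2', 0.1);                          % Gaussian observation model
    gpcf = gpcf_sexp('lengthScale', 1, 'magnSigma2', 1);         % squared-exponential covariance
    gp   = gp_set('lik', lik, 'cf', gpcf);                       % assemble the model structure
    gp   = gp_optim(gp, x, y);                                   % optimise the hyperparameters
    [Eft, Varft] = gp_pred(gp, x, y, xt);                        % latent predictive mean and variance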

Changes:

2014-07-22 Version 4.5

New features

  • Input dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • Option 'autoscale' in gp_rnd.m to get split-normal approximated samples from the posterior predictive distribution of the latent variable.

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.

Improvements

  • New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating new features etc.
    • demo_epinf, demonstrating the input dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes


RankSVM NC 1.0

by rflamary - July 10, 2014, 15:51:21 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1031 views, 236 downloads, 1 subscription

About: This package is an implementation of a linear RankSVM solver with non-convex regularization.

Changes:

Initial Announcement on mloss.org.


MIToolbox 2.1

by apocock - June 30, 2014, 01:05:57 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 14332 views, 2685 downloads, 1 subscription

About: A mutual information library for C, with MEX bindings for MATLAB. Aimed at feature selection, it provides simple methods to calculate mutual information, conditional mutual information, entropy, conditional entropy, Renyi entropy/mutual information, and weighted variants of Shannon entropies/mutual informations. It works with discrete distributions and expects column vectors of features.
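To make the estimated quantities concrete, the following self-contained Octave sketch computes the mutual information of two discrete feature vectors from their joint histogram; it illustrates what the library estimates and is not the MIToolbox call syntax.

    % Mutual information I(X;Y) in bits for discrete column vectors (illustrative)
    x = [1 1 2 2 3 3]'; y = [1 1 2 2 1 1]';           % toy discrete features
    [~, ~, xi] = unique(x); [~, ~, yi] = unique(y);   % map values to category indices
    pxy = accumarray([xi yi], 1) / numel(x);          % joint distribution estimate
    px = sum(pxy, 2); py = sum(pxy, 1);               % marginals
    P = px * py;                                      % product of marginals (outer product)
    nz = pxy > 0;                                     % skip zero cells to avoid log(0)
    I = sum(pxy(nz) .* log2(pxy(nz) ./ P(nz)));       % mutual information in bits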

Changes:

Added weighted entropy functions. Fixed a few memory handling bugs.


JMLR Information Theoretical Estimators 0.60

by szzoli - June 3, 2014, 00:17:33 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 54585 views, 11466 downloads, 2 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE additionally supports (i) combinations of the estimation techniques, (ii) easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.

Changes:
  • Quick test on the Tsallis divergence: introduced.

  • Pearson chi square divergence estimation in the exponential family (MLE + analytical formula): added.


The Choquet Kernel 1.00

by AliFall - February 11, 2014, 16:21:15 CET [ BibTeX BibTeX for corresponding Paper Download ] 933 views, 251 downloads, 1 subscription

About: The package computes the optimal parameters for the Choquet kernel.

Changes:

Initial Announcement on mloss.org.


LIBOL 0.3.0

by stevenhoi - December 12, 2013, 15:26:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 8486 views, 2756 downloads, 2 subscriptions

About: LIBOL is an open-source library with a family of state-of-the-art online learning algorithms for machine learning and big data analytics research. The current version supports 16 online algorithms for binary classification and 13 online algorithms for multiclass classification.
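For readers new to the setting, online learning follows a predict/receive-label/update protocol; the toy Octave perceptron below illustrates that protocol for binary classification and is not the LIBOL interface (the data here are made up).

    % Online perceptron: one pass over a stream of (x_t, y_t) pairs with labels in {-1,+1}
    X = [randn(50, 2) + 1; randn(50, 2) - 1];          % toy two-class data
    y = [ones(50, 1); -ones(50, 1)];
    w = zeros(size(X, 2), 1);
    mistakes = 0;
    for t = 1:size(X, 1)
      yhat = sign(X(t, :) * w);                        % predict before the label is revealed
      if yhat ~= y(t)                                  % mistake-driven update
        w = w + y(t) * X(t, :)';
        mistakes = mistakes + 1;
      end
    end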

Changes:

In contrast to our last version (V0.2.3), the new version (V0.3.0) makes the following important changes:

• Added a template and guide for adding new algorithms;

• Improved parameter settings and made the documentation clearer;

• Improved documentation on data formats and key functions;

• Amended the "OGD" function to use different loss types;

• Fixed some naming inconsistencies and other minor bugs.


About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference.
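For context on the MAP-estimation side, here is a tiny illustrative Octave sketch of L2-penalised logistic regression fitted by gradient ascent; it shows the kind of GLM fit the toolbox addresses but is not the glm-ie interface, and the data, penalty weight and step size are made up for the example.

    % MAP estimate for L2-penalised logistic regression (illustrative)
    X = [randn(50, 3) + 0.5; randn(50, 3) - 0.5];      % toy design matrix
    y = [ones(50, 1); zeros(50, 1)];                   % labels in {0,1}
    lambda = 1; step = 1e-3;
    w = zeros(size(X, 2), 1);
    for it = 1:500
      p = 1 ./ (1 + exp(-X * w));                      % predicted class probabilities
      g = X' * (y - p) - lambda * w;                   % gradient of the penalised log-likelihood
      w = w + step * g;                                % fixed-step gradient ascent
    end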

Changes:

added factorial mean field inference as a third algorithm complementing expectation propagation and variational Bayes

generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used


AlCoCoMa 1.0

by kiraly - November 8, 2013, 09:38:07 CET [ BibTeX BibTeX for corresponding Paper Download ] 1225 views, 298 downloads, 1 subscription

About: ALgebraic COmbinatorial COmpletion of MAtrices. A collection of algorithms to impute or denoise single entries in an incomplete rank one matrix, to determine for which entries this is possible with any algorithm, and to provide algorithm-independent error estimates. Includes demo scripts.
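The algebraic idea behind single-entry completion of a rank-one matrix is compact: every 2x2 minor of a rank-one matrix vanishes, so a missing entry can be imputed from three observed entries that share its row and column. A short Octave illustration (not the AlCoCoMa code itself, and with made-up indices) follows.

    % Impute the entry M(i,j) of a rank-one matrix from M(i,l), M(k,j) and M(k,l)
    u = [1; 2; 3]; v = [4; 5; 6]; M = u * v';      % toy rank-one matrix
    i = 1; j = 2; k = 3; l = 3;                    % treat M(i,j) as the missing entry
    Mij = M(i, l) * M(k, j) / M(k, l);             % requires M(k,l) ~= 0; equals u(i)*v(j)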

Changes:

Initial Announcement on mloss.org.


GPgrid toolkit for fast GP analysis on grid input 0.1

by ejg20 - September 16, 2013, 18:01:16 CET [ BibTeX Download ] 993 views, 343 downloads, 1 subscription

About: GPgrid toolkit for fast GP analysis on grid input

Changes:

Initial Announcement on mloss.org.


About: Fast Multidimensional GP Inference using Projected Additive Approximation

Changes:

Initial Announcement on mloss.org.


About: Stochastic neighbor embedding originally aims at reconstructing given distance relations in a low-dimensional Euclidean space. This can be regarded as a general approach to multi-dimensional scaling, but the reconstruction is based on the definition of input (and output) neighborhood probabilities alone. The present implementation also allows for handling dissimilarity or score-induced neighborhood topologies and makes use of quasi second-order gradient-based (L-)BFGS optimization.
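The input neighborhood probabilities at the heart of the method can be written down in a few Octave lines; the sketch below illustrates the definition with a single global bandwidth sigma (whereas SNE normally tunes a per-point bandwidth from a target perplexity) and is not this package's interface.

    % Neighborhood probabilities p(j|i) from pairwise squared distances (illustrative)
    X = randn(10, 2); sigma = 1;                            % toy points and a global bandwidth
    D = sum(X.^2, 2) + sum(X.^2, 2)' - 2 * (X * X');        % pairwise squared distances
    P = exp(-D / (2 * sigma^2));                            % unnormalised Gaussian affinities
    P(1:size(P, 1) + 1:end) = 0;                            % a point is not its own neighbor
    P = P ./ sum(P, 2);                                     % row-normalise: each row sums to one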

Changes:
  • Fixed the gradient in xsne_fun.m (a constant factor m was missing).

  • Re-introduced the symmetry option, allowing both symmetric and asymmetric versions of SNE and t-SNE.


AROFAC 1.0

by kiraly - August 5, 2013, 10:56:21 CET [ BibTeX BibTeX for corresponding Paper Download ] 1376 views, 365 downloads, 1 subscription

About: Approximate Rank One FACtorization of tensors. An algorithm for factorization of three-way-tensors and determination of their rank, includes example applications.

Changes:

Initial Announcement on mloss.org.


cbMDS Correlation Based Multi Dimensional Scaling 1.2

by emstrick - July 27, 2013, 14:35:36 CET [ BibTeX BibTeX for corresponding Paper Download ] 3475 views, 884 downloads, 1 subscription

About: The aim is to embed a given data relationship matrix into a low-dimensional Euclidean space such that the point distances / distance ranks correlate best with the original input relationships. Input relationships may be given as (sparse) (asymmetric) distance, dissimilarity, or (negative!) score matrices. Input-output relations are modeled as low-conditioned. (Weighted) Pearson and soft Spearman rank correlation, and unweighted soft Kendall correlation are the supported correlation measures for input/output object neighborhood relationships.

Changes:
  • Initial release (Ver 1.0): Weighted Pearson correlation and soft Spearman rank correlation, Tue Dec 4 16:14:51 CET 2012

  • Ver 1.1 Added soft Kendall correlation, Fri Mar 8 08:41:09 CET 2013

  • Ver 1.2 Added reconstruction of sparse relationship matrices, Fri Jul 26 16:58:37 CEST 2013


JMLR libDAI 0.3.1

by jorism - September 17, 2012, 14:17:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 35763 views, 6648 downloads, 2 subscriptions

Rating: 5 stars (based on 1 vote)

About: libDAI provides free & open source implementations of various (approximate) inference methods for graphical models with discrete variables, including Bayesian networks and Markov Random Fields.

Changes:

Release 0.3.1 fixes various bugs. The issues on 64-bit Windows platforms have been fixed and libDAI now offers full 64-bit support on all supported platforms (Linux, Mac OSX, Windows).


About: The package provides a Lagrangian approach to the posterior regularization of given linear mappings. This is important in two cases: (a) when systems are under-determined, and (b) when the external model for calculating the mapping is invariant to properties such as scaling. The software may be applied in cases where the external model does not provide its own regularization strategy. In addition, the package allows ranking attributes according to their distortion potential with respect to a given linear mapping.

Changes:

Version 1.1 (May 23, 2012): memory and time optimizations; distderivrel.m now supports assessing the relevance of attribute pairs.

Version 1.0 (Nov 9, 2011): Initial Announcement on mloss.org.


TMBP 1.0

by zengjia - April 5, 2012, 06:42:26 CET [ BibTeX BibTeX for corresponding Paper Download ] 4358 views, 2160 downloads, 2 subscriptions

Rating: 5 stars (based on 1 vote)

About: Message passing for topic modeling

Changes:
  1. Improved "readme.pdf".
  2. Corrected some compilation errors.

About: This local and parallel computation toolbox is the Octave and Matlab implementation of several localized Gaussian process regression methods: the domain decomposition method (Park et al., 2011, DDM), partial independent conditional (Snelson and Ghahramani, 2007, PIC), localized probabilistic regression (Urtasun and Darrell, 2008, LPR), and bagging for Gaussian process regression (Chen and Ren, 2009, BGP). Most of the localized regression methods can be applied to general machine learning problems, although DDM is only applicable to spatial datasets. In addition, GPLP provides two parallel computation versions of the domain decomposition method. Ease of parallelization is one of the advantages of localized regression, and the two parallel implementations provide guidance on how to realize this advantage in software.
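The common thread of these localized methods is that each prediction uses only training points near the test input; the rough Octave illustration below captures that idea with a plain nearest-neighbour local GP (not the GPLP interface; data and kernel settings are made up).

    % Local GP prediction: use only the m nearest training points to the test input xt
    n = 500; X = rand(n, 2); y = sin(4 * X(:, 1)) + 0.1 * randn(n, 1);   % toy spatial-style data
    xt = [0.5 0.5];                                        % single test input
    m = 50; ell = 0.2; sn2 = 0.01;                         % neighbourhood size, length scale, noise
    [~, idx] = sort(sum((X - xt).^2, 2));                  % distances from xt to all training inputs
    I = idx(1:m);                                          % the m nearest training points
    sqd = @(A, B) sum(A.^2, 2) + sum(B.^2, 2)' - 2 * A * B';
    k = @(A, B) exp(-0.5 * sqd(A, B) / ell^2);             % squared-exponential kernel
    K = k(X(I, :), X(I, :)) + sn2 * eye(m);
    mu = k(xt, X(I, :)) * (K \ y(I));                      % local GP predictive mean at xt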

Changes:

Initial Announcement on mloss.org.

