33 projects found that use Octave as the programming language; items 1-20 are shown on this page.

BayesOpt, a Bayesian Optimization toolbox 0.7.2

by rmcantin - October 10, 2014, 19:12:59 CET [ Project Homepage BibTeX Download ] 8521 views, 1759 downloads, 4 subscriptions

About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). There are also interfaces for C, Matlab/Octave and Python.

Changes:

Fixed bugs and documentation typos.
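
As an illustration of the Matlab/Octave interface, here is a minimal sketch that minimizes a toy 2-D function; the entry-point name bayesoptcont and the fields of the parameter struct are assumptions based on typical usage, not verified against this release.

    % Hedged sketch: minimize a toy 2-D objective via BayesOpt's Octave interface.
    % 'bayesoptcont' and the 'params' field names are assumed, not verified.
    f = @(x) sum((x - 0.3).^2);          % toy objective on [0,1]^2

    params.n_iterations  = 50;           % number of optimization steps (assumed field)
    params.verbose_level = 1;            % assumed field

    lb = zeros(1, 2);                    % lower bounds
    ub = ones(1, 2);                     % upper bounds

    [xopt, yopt] = bayesoptcont(f, 2, params, lb, ub);   % assumed signature
    printf("minimum %.4f at (%.3f, %.3f)\n", yopt, xopt(1), xopt(2));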


Toeblitz Toolkit for Fast Toeplitz Matrix Operations 1.03

by cunningham - August 13, 2014, 02:21:36 CET [ BibTeX Download ] 2342 views, 595 downloads, 2 subscriptions

About: Toeblitz is a MATLAB/Octave package for operations on positive definite Toeplitz matrices. It can solve Toeplitz systems Tx = b in O(n*log(n)) time and O(n) memory, compute matrix inverses T^(-1) (with free log determinant) in O(n^2) time and memory, compute log determinants (without inverses) in O(n^2) time and O(n) memory, and compute traces of products A*T for any matrix A, in minimal O(n^2) time and memory.

Changes:

Added a write-up in written/toeblitz.pdf describing the package.
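
To make the advertised operations concrete, the sketch below builds a positive definite Toeplitz matrix from its first column and solves Tx = b with Octave's dense solver as a reference; the package's own fast solver is only hinted at in the final comment, since its exact function name is an assumption.

    % Build a symmetric PD Toeplitz matrix from its first column (a squared-
    % exponential covariance on a regular grid, plus jitter) and solve T*x = b.
    n = 1024;
    t = (0:n-1)';
    c = exp(-0.5 * (t / 10).^2) + 1e-6 * (t == 0);   % first column of T
    T = toeplitz(c);                                  % dense O(n^2) representation
    b = randn(n, 1);

    x_dense = T \ b;             % reference dense solve, O(n^3) time

    % Toeblitz solves the same system in O(n*log(n)) time and O(n) memory from
    % the first column alone; the function name below is assumed, not verified:
    % x_fast = toeblitz_solve(c, b);
    % norm(x_fast - x_dense)     % should be near machine precision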


JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 14126 views, 3512 downloads, 2 subscriptions

Rating: 5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2014-07-22 Version 4.5

New features

  • Input dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • Option 'autoscale' in gp_rnd.m for drawing split-normal approximated samples from the posterior predictive distribution of the latent variable.

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.

Improvements

  • New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating new features etc.
    • demo_epinf, demonstrating the input dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes
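
For orientation, a minimal GPstuff regression sketch follows; the constructor and prediction function names (lik_gaussian, gpcf_sexp, gp_set, gp_optim, gp_pred) follow the GPstuff 4.x conventions as I recall them and should be verified against the toolbox documentation.

    % Minimal GP regression sketch in the GPstuff style (names assumed, see above).
    x  = linspace(0, 10, 50)';            % training inputs
    y  = sin(x) + 0.1 * randn(size(x));   % noisy observations
    xt = linspace(0, 10, 200)';           % test inputs

    lik  = lik_gaussian('sigma2', 0.1);                    % Gaussian observation model
    gpcf = gpcf_sexp('lengthScale', 1, 'magnSigma2', 1);   % squared-exponential covariance
    gp   = gp_set('lik', lik, 'cf', gpcf);                 % assemble the model structure

    gp = gp_optim(gp, x, y);              % optimize hyperparameters (type-II ML)
    [Ef, Varf] = gp_pred(gp, x, y, xt);   % posterior mean and variance of the latent function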


RankSVM NC 1.0

by rflamary - July 10, 2014, 15:51:21 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 798 views, 174 downloads, 1 subscription

About: This package is an implementation of a linear RankSVM solver with non-convex regularization.

Changes:

Initial Announcement on mloss.org.


MIToolbox 2.1

by apocock - June 30, 2014, 01:05:57 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 13336 views, 2517 downloads, 1 subscription

About: A mutual information library for C with MEX bindings for MATLAB. It is aimed at feature selection and provides simple methods to calculate mutual information, conditional mutual information, entropy, conditional entropy, Rényi entropy/mutual information, and weighted variants of Shannon entropies/mutual information. It works with discrete distributions and expects column vectors of features.

Changes:

Added weighted entropy functions. Fixed a few memory handling bugs.
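
To show what the library computes, the following sketch estimates the mutual information of two discrete feature columns directly from the definition; the single library call at the end uses a function name (mi) that is an assumption about the MEX bindings rather than a verified signature.

    % Mutual information of two discrete feature columns, from the definition.
    x = randi(4, 1000, 1);                       % discrete feature (column vector)
    y = mod(x + randi(2, 1000, 1), 4) + 1;       % a feature partially dependent on x

    pxy  = accumarray([x y], 1) / numel(x);      % joint distribution estimate
    px   = sum(pxy, 2);                          % marginals
    py   = sum(pxy, 1);
    pind = px * py;                              % product of marginals
    nz   = pxy > 0;
    I    = sum(pxy(nz) .* log2(pxy(nz) ./ pind(nz)));   % mutual information in bits

    % I_lib = mi(x, y);   % hypothetical MIToolbox call for the same quantity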


JMLR Information Theoretical Estimators 0.60

by szzoli - June 3, 2014, 00:17:33 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 49440 views, 10491 downloads, 2 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE supports additionally (i) the combinations of the estimation techniques, (ii) the easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.

Changes:
  • Quick test on the Tsallis divergence: introduced.

  • Pearson chi square divergence estimation in the exponential family (MLE + analytical formula): added.
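
ITE estimators typically follow an initialization + estimation pattern; the sketch below applies it to Shannon entropy with a k-nearest-neighbour estimator. The function names are reproduced from memory of the ITE naming scheme and should be checked against the package documentation.

    % Shannon differential entropy of a 3-D Gaussian sample (ITE-style pattern).
    Y = randn(3, 5000);                          % d x T: samples stored column-wise

    co = HShannon_kNN_k_initialization(1);       % create the estimator cost object
    H  = HShannon_kNN_k_estimation(Y, co);       % estimated entropy in nats

    % Analytical value for a standard 3-D Gaussian, for comparison:
    H_true = 0.5 * log((2 * pi * exp(1))^3);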


The Choquet Kernel 1.00

by AliFall - February 11, 2014, 16:21:15 CET [ BibTeX BibTeX for corresponding Paper Download ] 830 views, 220 downloads, 1 subscription

About: The package computes the optimal parameters for the Choquet kernel.

Changes:

Initial Announcement on mloss.org.


LIBOL 0.3.0

by stevenhoi - December 12, 2013, 15:26:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 7683 views, 2366 downloads, 2 subscriptions

About: LIBOL is an open-source library with a family of state-of-the-art online learning algorithms for machine learning and big data analytics research. The current version supports 16 online algorithms for binary classification and 13 online algorithms for multiclass classification.

Changes:

In contrast to the previous version (V0.2.3), the new version (V0.3.0) makes the following changes:

• Added a template and guide for adding new algorithms;

• Improved parameter settings and made the documentation clearer;

• Improved the documentation on data formats and key functions;

• Amended the "OGD" function to support different loss types;

• Fixed some naming inconsistencies and other minor bugs.
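
To illustrate the kind of algorithm LIBOL implements (this is not LIBOL's own code), here is a from-scratch online-gradient-descent learner with hinge loss for binary classification, mirroring the "OGD" variant mentioned above.

    % From-scratch OGD sketch; illustrative only, not the LIBOL implementation.
    n = 1000;  d = 20;
    X = randn(n, d);
    w_true = randn(d, 1);
    y = sign(X * w_true + 0.1 * randn(n, 1));    % labels in {-1, +1}

    w = zeros(d, 1);  eta = 0.1;  mistakes = 0;
    for t = 1:n
      yhat = sign(X(t, :) * w);                  % predict before seeing the label
      mistakes += (yhat != y(t));
      if y(t) * (X(t, :) * w) < 1                % hinge loss is active
        w += eta / sqrt(t) * y(t) * X(t, :)';    % gradient step with decaying rate
      end
    end
    printf("online mistake rate: %.3f\n", mistakes / n);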


JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.4

by hn - November 11, 2013, 14:46:52 CET [ Project Homepage BibTeX Download ] 19203 views, 4546 downloads, 3 subscriptions

Rating: 5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.

Changes:
  • derivatives w.r.t. inducing points xu in infFITC, infFITC_Laplace, infFITC_EP so that one can treat the inducing points either as fixed given quantities or as additional hyperparameters
  • new GLM likelihood likExp for inter-arrival time modeling
  • new GLM likelihood likWeibull for extremal value regression
  • new GLM likelihood likGumbel for extremal value regression
  • new mean function meanPoly depending polynomially on the data
  • infExact can deal safely with the zero noise variance limit
  • support of GP warping through the new likelihood function likGaussWarp
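
A minimal regression call in the GPML style is sketched below; gp, minimize and the listed mean/covariance/likelihood/inference functions are part of the GPML 3.x interface, though the hyperparameter settings here are only illustrative.

    % GP regression with GPML 3.x: optimize hyperparameters, then predict.
    x  = linspace(-3, 3, 30)';            % training inputs
    y  = sin(x) + 0.1 * randn(size(x));   % noisy targets
    xs = linspace(-3, 3, 200)';           % test inputs

    hyp.mean = [];                        % meanZero has no hyperparameters
    hyp.cov  = log([1; 1]);               % covSEiso: log length-scale, log signal std
    hyp.lik  = log(0.1);                  % likGauss: log noise std

    hyp = minimize(hyp, @gp, -100, @infExact, @meanZero, @covSEiso, @likGauss, x, y);
    [ymu, ys2] = gp(hyp, @infExact, @meanZero, @covSEiso, @likGauss, x, y, xs);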

About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference.

Changes:

• Added factorial mean field inference as a third algorithm complementing expectation propagation and variational Bayes.

• Generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used.


AlCoCoMa 1.0

by kiraly - November 8, 2013, 09:38:07 CET [ BibTeX BibTeX for corresponding Paper Download ] 1084 views, 268 downloads, 1 subscription

About: ALgebraic COmbinatorial COmpletion of MAtrices. A collection of algorithms to impute or denoise single entries in an incomplete rank one matrix, to determine for which entries this is possible with any algorithm, and to provide algorithm-independent error estimates. Includes demo scripts.

Changes:

Initial Announcement on mloss.org.
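
The algebraic idea behind single-entry completion of a rank-one matrix can be seen in a few lines (an illustration of the identity, not the package's code): every 2x2 minor of a rank-one matrix vanishes, so a missing entry is determined by three observed entries whenever the shared entry is nonzero.

    % Rank-one single-entry completion via a vanishing 2x2 minor (illustration only).
    u = randn(5, 1);  v = randn(4, 1);
    A = u * v';                                  % exact rank-one matrix

    i = 2; j = 3; k = 5; l = 1;                  % pretend A(i,j) is unobserved
    A_ij = A(i, l) * A(k, j) / A(k, l);          % reconstruct it from three observed entries
    abs(A_ij - A(i, j))                          % zero up to rounding error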


GPgrid toolkit for fast GP analysis on grid input 0.1

by ejg20 - September 16, 2013, 18:01:16 CET [ BibTeX Download ] 868 views, 302 downloads, 1 subscription

About: GPgrid toolkit for fast GP analysis on grid input

Changes:

Initial Announcement on mloss.org.


About: Fast Multidimensional GP Inference using Projected Additive Approximation

Changes:

Initial Announcement on mloss.org.


About: Stochastic neighbor embedding originally aims at the reconstruction of given distance relations in a low-dimensional Euclidean space. This can be regarded as a general approach to multi-dimensional scaling, but the reconstruction is based on the definition of input (and output) neighborhood probabilities alone. The present implementation also allows for handling dissimilarity- or score-induced neighborhood topologies and makes use of quasi-second-order gradient-based (l-)BFGS optimization.

Changes:
  • gradient in xsne_fun.m fixed! (constant factor m was missing)

  • symmetry option re-introduced allowing for enabling symmetric and asymmetric versions of SNE and t-SNE
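
For reference, the input neighborhood probabilities that SNE-type methods try to reconstruct can be computed as below (an illustration with a fixed Gaussian bandwidth, not the toolbox's own code, which additionally handles dissimilarity- and score-induced topologies).

    % Gaussian input neighborhood probabilities p(j|i) from pairwise distances.
    X  = randn(100, 5);                              % 100 points in 5 dimensions
    D2 = sum(X.^2, 2) + sum(X.^2, 2)' - 2 * X * X';  % pairwise squared distances
    sigma = 1.0;                                     % fixed bandwidth (SNE adapts it per point)

    P = exp(-D2 / (2 * sigma^2));
    P(1:size(P, 1) + 1:end) = 0;                     % no self-neighborhoods
    P = P ./ sum(P, 2);                              % row-normalize: P(i, j) = p(j|i)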


AROFAC 1.0

by kiraly - August 5, 2013, 10:56:21 CET [ BibTeX BibTeX for corresponding Paper Download ] 1249 views, 339 downloads, 1 subscription

About: Approximate Rank One FACtorization of tensors. An algorithm for the factorization of three-way tensors and determination of their rank; includes example applications.

Changes:

Initial Announcement on mloss.org.


cbMDS Correlation Based Multi Dimensional Scaling 1.2

by emstrick - July 27, 2013, 14:35:36 CET [ BibTeX BibTeX for corresponding Paper Download ] 3181 views, 834 downloads, 1 subscription

About: The aim is to embed a given data relationship matrix into a low-dimensional Euclidean space such that the point distances / distance ranks correlate best with the original input relationships. Input relationships may be given as (sparse) (asymmetric) distance, dissimilarity, or (negative!) score matrices. Input-output relations are modeled as low-conditioned. (Weighted) Pearson and soft Spearman rank correlation, and unweighted soft Kendall correlation are supported correlation measures for input/output object neighborhood relationships.

Changes:
  • Initial release (Ver 1.0): Weighted Pearson correlation and soft Spearman rank correlation, Tue Dec 4 16:14:51 CET 2012

  • Ver 1.1 Added soft Kendall correlation, Fri Mar 8 08:41:09 CET 2013

  • Ver 1.2 Added reconstruction of sparse relationship matrices, Fri Jul 26 16:58:37 CEST 2013
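
The quantity cbMDS optimizes can be made concrete as follows (an illustration, not the package's code): given an input dissimilarity matrix and a candidate embedding, compute the Pearson correlation between the input dissimilarities and the embedding distances over all point pairs.

    % Correlation between input dissimilarities and embedding distances (illustration).
    n = 50;
    X = randn(n, 4);                                     % original data
    G = X * X';
    D = sqrt(max(diag(G) + diag(G)' - 2 * G, 0));        % input dissimilarities
    Y = X(:, 1:2);                                       % a crude 2-D embedding
    H = Y * Y';
    E = sqrt(max(diag(H) + diag(H)' - 2 * H, 0));        % embedding distances

    iu = triu(true(n), 1);                               % use each pair once
    d  = D(iu);  e = E(iu);
    r  = sum((d - mean(d)) .* (e - mean(e))) / ((numel(d) - 1) * std(d) * std(e));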


JMLR libDAI 0.3.1

by jorism - September 17, 2012, 14:17:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 34570 views, 6458 downloads, 2 subscriptions

Rating: 5 stars (based on 1 vote)

About: libDAI provides free & open source implementations of various (approximate) inference methods for graphical models with discrete variables, including Bayesian networks and Markov Random Fields.

Changes:

Release 0.3.1 fixes various bugs. The issues on 64-bit Windows platforms have been fixed and libDAI now offers full 64-bit support on all supported platforms (Linux, Mac OSX, Windows).


About: The package provides a Lagrangian approach to the posterior regularization of given linear mappings. This is important in two cases: (a) when systems are under-determined, and (b) when the external model for calculating the mapping is invariant to properties such as scaling. The software may be applied when the external model does not provide its own regularization strategy. In addition, the package allows ranking attributes according to their distortion potential for a given linear mapping.

Changes:

Version 1.1 (May 23, 2012): memory and time optimizations; distderivrel.m now supports assessing the relevance of attribute pairs.

Version 1.0 (Nov 9, 2011): Initial Announcement on mloss.org.


TMBP 1.0

by zengjia - April 5, 2012, 06:42:26 CET [ BibTeX BibTeX for corresponding Paper Download ] 4081 views, 2000 downloads, 2 subscriptions

Rating: 5 stars (based on 1 vote)

About: Message passing for topic modeling

Changes:
  1. Improved "readme.pdf".
  2. Corrected some compilation errors.

About: This local and parallel computation toolbox is the Octave and Matlab implementation of several localized Gaussian process regression methods: the domain decomposition method (Park et al., 2011, DDM), partially independent conditional (Snelson and Ghahramani, 2007, PIC), localized probabilistic regression (Urtasun and Darrell, 2008, LPR), and bagging for Gaussian process regression (Chen and Ren, 2009, BGP). Most of the localized regression methods can be applied to general machine learning problems, although DDM is only applicable to spatial datasets. In addition, GPLP provides two parallel implementations of the domain decomposition method. Ease of parallelization is one of the advantages of localized regression, and the two parallel implementations offer guidance on how to realize this advantage in software.

Changes:

Initial Announcement on mloss.org.

