160 projects found that use MATLAB as the programming language.
Showing items 1-20 of 160.

1SpectralClustering 1.2

by tbuehler - May 1, 2018, 19:26:07 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 33421 views, 7320 downloads, 0 subscriptions

About: A fast and scalable graph-based clustering algorithm based on the eigenvectors of the nonlinear 1-Laplacian.
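
For orientation, the sketch below shows classical spectral bipartitioning with the linear graph Laplacian in MATLAB; 1SpectralClustering instead computes eigenvectors of the nonlinear 1-Laplacian, so this is background only, not the package's interface, and the affinity matrix is an assumed toy example.

    % Classical (2-Laplacian) spectral bipartitioning, shown for comparison only;
    % the package itself works with the nonlinear 1-Laplacian instead.
    n = 100;
    W = rand(n); W = (W + W')/2; W(1:n+1:end) = 0;   % toy symmetric affinity matrix (assumption)
    D = diag(sum(W, 2));
    L = D - W;                                        % unnormalized graph Laplacian
    [V, E] = eig(L, D);                               % generalized eigenproblem L*v = lambda*D*v
    [~, order] = sort(diag(E));
    fiedler = V(:, order(2));                         % second smallest eigenvector (Fiedler vector)
    labels = fiedler > median(fiedler);               % threshold to obtain a two-way cut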

Changes:
  • improved optimization of ncut and rcut criterion
  • optimized eigenvector initialization
  • changed default values for number of runs
  • several internal optimizations
  • made console output more informative

Somoclu 1.7.5

by peterwittek - March 1, 2018, 23:30:34 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 92942 views, 17034 downloads, 0 subscriptions

About: Somoclu is a massively parallel implementation of self-organizing maps. It relies on OpenMP for multicore execution, MPI for distributing the workload, and it can be accelerated by CUDA on a GPU cluster. A sparse kernel is also included, which is useful for training maps on vector spaces generated in text mining processes. Apart from a command line interface, Python, Julia, R, and MATLAB are supported.
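
Somoclu parallelizes self-organizing map training; the minimal serial SOM loop below only illustrates what a single training pass computes. It is not Somoclu's MATLAB interface, and the map size, learning-rate schedule and neighborhood radius are assumptions.

    rng(0);
    X = rand(1000, 3);                              % toy data, one sample per row (assumption)
    nRows = 10; nCols = 10; nUnits = nRows*nCols;
    W = rand(nUnits, size(X, 2));                   % codebook, one map unit per row
    [gr, gc] = ind2sub([nRows nCols], (1:nUnits)'); % grid coordinates of each unit
    nEpochs = 20;
    for epoch = 1:nEpochs
        sigma = 3*(1 - epoch/nEpochs) + 0.5;        % shrinking neighborhood radius
        eta   = 0.5*(1 - epoch/nEpochs) + 0.01;     % decaying learning rate
        for i = randperm(size(X, 1))
            x = X(i, :);
            [~, bmu] = min(sum((W - x).^2, 2));     % best-matching unit
            d2 = (gr - gr(bmu)).^2 + (gc - gc(bmu)).^2;
            h = exp(-d2 / (2*sigma^2));             % Gaussian neighborhood on the grid
            W = W + eta * h .* (x - W);             % pull units toward the sample
        end
    end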

Changes:
  • New: A Makefile for mingw to build on Windows.
  • Changed: PR #94 added a much more efficient sparse kernel.
  • Changed: boilerplate code for Julia greatly improved.
  • Changed: Code cleanup, pre-processor macros simplified.
  • Changed: Adapted to Seaborn API changes in plotting heatmaps.

JMLR GPML Gaussian Processes for Machine Learning Toolbox 4.1

by hn - November 27, 2017, 19:26:13 CET [ Project Homepage BibTeX Download ] 86312 views, 19234 downloads, 0 subscriptions

Rating: 5/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave/Matlab implementation of inference and prediction with Gaussian process models. The toolbox offers exact inference, approximate inference for non-Gaussian likelihoods (Laplace's method, Expectation Propagation, Variational Bayes), as well as approximations for large datasets (FITC, VFE, KISS-GP). A wide range of covariance, likelihood, mean and hyperprior functions allows very complex GP models to be built.
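
A minimal regression example in the style of the GPML documentation is sketched below; the data are toy values, and the exact-inference handle infGaussLik is the one used in the 4.x releases, so treat this as a hedged usage sketch rather than a verbatim demo.

    x  = linspace(-3, 3, 20)'; y = sin(x) + 0.1*randn(size(x));  % toy data (assumption)
    xs = linspace(-3, 3, 101)';                                  % test inputs
    meanfunc = {@meanZero};  hyp.mean = [];
    covfunc  = {@covSEiso};  hyp.cov  = log([1; 1]);   % log length-scale, log signal std
    likfunc  = {@likGauss};  hyp.lik  = log(0.1);      % log noise std
    % optimize hyperparameters by minimizing the negative log marginal likelihood
    hyp = minimize(hyp, @gp, -100, @infGaussLik, meanfunc, covfunc, likfunc, x, y);
    % predictive mean and variance at the test inputs
    [mu, s2] = gp(hyp, @infGaussLik, meanfunc, covfunc, likfunc, x, y, xs);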

Changes:

Logdet-estimation functionality for grid-based approximate covariances
  • Lanczos subspace estimation
  • Chebyshev polynomial expansion

More generic infEP functionality
  • dense computations and sparse approximations using the same code
  • covering KL inference as a special case of EP

New infKL function contributed by Emtiyaz Khan and Wu Lin
  • Conjugate-Computation Variational Inference algorithm
  • much more scalable than previous versions

Time-series covariance functions on the positive real line
  • covW (i-times integrated) Wiener process covariance
  • covOU (i-times integrated) Ornstein-Uhlenbeck process covariance (contributed by Juan Pablo Carbajal)
  • covULL underdamped linear Langevin process covariance (contributed by Robert MacKay)
  • covFBM fractional Brownian motion covariance

New covariance functions
  • covWarp implements k(w(x),w(z)) where w is a "warping" function
  • covMatern has been extended to also accept non-integer distance parameters


iLANN SVD. An incremental noniterative learning method for one layer feedforward neural networks 1.0

by ofontenla - August 16, 2017, 11:53:40 CET [ BibTeX BibTeX for corresponding Paper Download ] 5463 views, 1496 downloads, 0 subscriptions

About: A non-iterative, incremental and hyperparameter-free learning method for one-layer feedforward neural networks without hidden layers. The method efficiently obtains the optimal parameters of the network, regardless of whether the data contain more samples than variables or vice versa. It does this by using a square loss function that measures errors before the output activation functions and scales them by the slope of these functions at each data point. The outcome is a system of linear equations that yields the network's weights and that is solved using the Singular Value Decomposition.
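
The closed-form recipe described above can be sketched generically as follows: measure the error before a logistic output, weight it by the activation slope, and solve the resulting least-squares problem with an SVD-based pseudoinverse. This is an illustration of the idea, not the released code's interface, and the variable names and the logistic activation are assumptions.

    X = [rand(200, 5), ones(200, 1)];        % toy inputs plus a bias column (assumption)
    d = rand(200, 1)*0.8 + 0.1;              % targets in (0,1) for a logistic output
    zbar = log(d ./ (1 - d));                % desired pre-activation: inverse logistic
    s = d .* (1 - d);                        % logistic slope at the desired outputs
    A = X .* s;                              % slope-weighted design matrix
    b = zbar .* s;                           % slope-weighted targets
    w = pinv(A) * b;                         % SVD-based least-squares solution
    yhat = 1 ./ (1 + exp(-X * w));           % network outputs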

Changes:

Initial Announcement on mloss.org.


LANN SVD. A noniterative SVD based learning algorithm for one layer neural nets 1.0

by ofontenla - August 7, 2017, 13:52:19 CET [ BibTeX BibTeX for corresponding Paper Download ] 5432 views, 1455 downloads, 0 subscriptions

About: A non-iterative learning method for one-layer (no hidden layer) neural networks, where the weights can be calculated in closed form, thereby avoiding slow convergence and hyperparameter tuning. The proposed learning method, LANN-SVD for short, offers good computational efficiency for large-scale data analytics.

Changes:

Initial Announcement on mloss.org.


pSpectralClustering 1.2

by tbuehler - July 30, 2017, 20:07:52 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 25084 views, 5007 downloads, 0 subscriptions

About: A generalized version of spectral clustering using the graph p-Laplacian.

Changes:

various internal optimizations


Kernel Adaptive Filtering Toolbox 2.0

by steven2358 - May 22, 2017, 10:05:33 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 30838 views, 5809 downloads, 0 subscriptions

About: A Matlab benchmarking toolbox for online and adaptive regression with kernels.
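
As an illustration of the kind of online kernel regression the toolbox benchmarks, here is a minimal kernel LMS (KLMS) loop; it is a generic textbook update with an assumed step size and kernel width, not the toolbox's class interface.

    eta = 0.5; sigma = 1;                       % learning rate and Gaussian kernel width (assumptions)
    N = 500;
    x = randn(N, 2); y = sin(x(:,1)) + 0.1*randn(N, 1);   % toy data (assumption)
    dict = zeros(0, 2); alpha = zeros(0, 1);    % growing dictionary and coefficients
    err = zeros(N, 1);
    for n = 1:N
        if isempty(dict)
            yhat = 0;
        else
            k = exp(-sum((dict - x(n,:)).^2, 2) / (2*sigma^2));  % kernel evaluations
            yhat = alpha' * k;                  % prediction from the current dictionary
        end
        err(n) = y(n) - yhat;                   % instantaneous error
        dict = [dict; x(n,:)];                  % KLMS stores every input as a centre
        alpha = [alpha; eta * err(n)];          % coefficient for the new centre
    end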

Changes:
  • Changes in algorithms' Matlab class format
  • New algorithms
  • Minor improvements and bug fixes

JMLR MSVMpack 1.5.1

by lauerfab - March 9, 2017, 12:29:37 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 62524 views, 15192 downloads, 0 subscriptions

About: MSVMpack is a Multi-class Support Vector Machine (M-SVM) package. It is dedicated to SVMs which can handle more than two classes without relying on decomposition methods and implements the four M-SVM models from the literature: Weston and Watkins M-SVM, Crammer and Singer M-SVM, Lee, Lin and Wahba M-SVM, and the M-SVM2 of Guermeur and Monfrini.

Changes:
  • Fix compilation error with recent gcc

MIToolbox 3.0.1

by apocock - March 2, 2017, 00:38:52 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 64182 views, 10632 downloads, 0 subscriptions

About: A mutual information library for C, with MEX bindings for MATLAB. Aimed at feature selection, it provides simple methods to calculate mutual information, conditional mutual information, entropy, conditional entropy, Renyi entropy/mutual information, and weighted variants of the Shannon entropy and mutual information. It works with discrete distributions and expects column vectors of features.
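
The sketch below reproduces, from scratch, what a mutual information calculation over two discrete column vectors computes; it is for illustration only and does not use the library's own MEX functions.

    x = randi(3, 1000, 1); y = x + randi(2, 1000, 1);    % toy discrete features (assumption)
    [~, ~, xi] = unique(x); [~, ~, yi] = unique(y);      % map values to consecutive integers
    joint = accumarray([xi yi], 1) / numel(x);           % joint probability table
    px = sum(joint, 2); py = sum(joint, 1);              % marginals
    nz = joint > 0;                                      % skip zero-probability cells
    outer = px * py;                                     % product of marginals
    I = sum(joint(nz) .* log2(joint(nz) ./ outer(nz)));  % mutual information in bits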

Changes:

Fixed a Windows compilation bug. MIToolbox v3 should now compile using Visual Studio.


Bagging PCA Hashing 1.0

by openpr_nlpr - February 6, 2017, 10:38:53 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 7228 views, 1558 downloads, 0 subscriptions

About: The proposed hashing algorithm leverages the bootstrap sampling idea and integrates it with PCA, resulting in a new projection method called Bagging PCA Hashing.
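
A hedged reconstruction of that idea: draw bootstrap samples of the rows, take the top principal directions of each sample, pool them, and binarize by sign. The data, bag count and code length are assumptions, and this is not the released code's interface.

    X = randn(1000, 64);                             % toy data, one sample per row (assumption)
    Xc = X - mean(X, 1);                             % center once
    nBags = 4; bitsPerBag = 8; P = [];
    for b = 1:nBags
        idx = randi(size(Xc, 1), size(Xc, 1), 1);    % bootstrap sample of the rows
        [~, ~, V] = svd(Xc(idx, :), 'econ');         % principal directions of this bag
        P = [P, V(:, 1:bitsPerBag)];                 % keep the top directions
    end
    codes = Xc * P > 0;                              % 32-bit binary codes by sign thresholding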

Changes:

Initial Announcement on mloss.org.


Online Sketching Hashing 1.0

by openpr_nlpr - February 6, 2017, 10:36:19 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6923 views, 1390 downloads, 0 subscriptions

About: An online hashing algorithm that can handle streaming data with low computational cost.

Changes:

Initial Announcement on mloss.org.


NaN toolbox 3.1.2

by schloegl - January 22, 2017, 12:24:59 CET [ Project Homepage BibTeX Download ] 135044 views, 30957 downloads, 0 subscriptions

About: NaN-toolbox is a statistics and machine learning toolbox for handling data with and without missing values.

Changes:

Changes in v.3.1.2
  • improved configuration and build system
  • improved support for more platforms (including Octave 4.2.0)

Changes in v.3.0.3
  • improved compatibility for Octave on Windows

Changes in v.3.0.1
  • fixed packaging for Octave

Changes in v.2.8.5
  • bug fix: trimmean
  • compiler support for gcc-5 and clang
  • fixed typos

For details see the CHANGELOG at http://pub.ist.ac.at/~schloegl/matlab/NaN/CHANGELOG


FEAST 2.0.0

by apocock - January 8, 2017, 00:49:19 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 74800 views, 12606 downloads, 0 subscriptions

Rating: 5/5 (based on 2 votes)

About: FEAST provides implementations of common mutual information based filter feature selection algorithms (mim, mifs, mrmr, cmim, icap, jmi, disr, fcbf, etc), and an implementation of RELIEF. Written for C/C++ & Matlab.
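
A hedged usage sketch for the MATLAB wrapper is shown below; the feast() signature (criterion name, number of features to select, data matrix with one sample per row, label vector) is from memory and should be checked against the shipped feast.m.

    data = randi(5, 500, 20);                  % toy discrete feature matrix (assumption)
    labels = randi(2, 500, 1);                 % class labels
    selected = feast('mrmr', 5, data, labels); % indices of 5 features chosen by the mRMR criterion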

Changes:

Major refactoring of FEAST to improve speed and portability.

  • FEAST now clones the input data if it's floating point and discretises it to unsigned ints once in a single pass. This improves the speed by about 30%.
  • FEAST now has unsigned int entry points which avoid this discretisation and are much faster if the data is already categorical.
  • Added weighted feature selection algorithms to FEAST which can be used for cost-sensitive feature selection.
  • Added a Java API using JNI.
  • FEAST now returns the internal score for each feature according to the criterion. Available in all three APIs.
  • Rearranged the repository to make it easier to work with. Header files are now in `include`, source in `src`, the MATLAB API is in `matlab/` and the Java API is in `java/`.
  • FEAST now compiles cleanly using `-std=c89 -Wall -Werror`.

slim for matlab 0.2

by ustunb - August 23, 2016, 20:27:00 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9359 views, 2293 downloads, 0 subscriptions

About: Learn optimized scoring systems using MATLAB and the CPLEX Optimization Studio.

Changes:

Initial Announcement on mloss.org.


Social Impact theory based Optimizer library 1.1

by rishem - July 29, 2016, 13:19:47 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 26925 views, 5861 downloads, 0 subscriptions

About: This is an optimization library based on Social Impact Theory (SITO). The optimizer works in the same way as PSO and GA.

Changes:

bug removed


DeeBNet, a new object oriented MATLAB toolbox for Deep Belief Networks 3.2

by keyvanrad - June 26, 2016, 16:19:55 CET [ Project Homepage BibTeX Download ] 35346 views, 8181 downloads, 0 subscriptions

About: Deep architectures are now very popular in machine learning. Deep Belief Networks (DBNs) are deep architectures that use a stack of Restricted Boltzmann Machines (RBMs) to create a powerful generative model from training data. DBNs support tasks such as feature extraction and classification and are used in many applications, including image processing, speech processing and text categorization. DeeBNet is a new object-oriented toolbox with the most important abilities needed for implementing DBNs. In experiments on the MNIST (image), ISOLET (speech) and 20 Newsgroups (text) datasets, the toolbox automatically learned a good representation of the input from unlabeled data, with better discrimination between classes, and its classification errors on all these datasets are comparable to those of state-of-the-art classifiers. In addition, the toolbox supports different sampling methods (e.g. Gibbs, CD, PCD and our new FEPCD method), different sparsity methods (quadratic, rate distortion and our new normal method), different RBM types (generative and discriminative), GPU-based computation, and more. The toolbox is user-friendly, open-source software in MATLAB and Octave and is freely available on the website.
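
To give a feel for the layerwise training a DBN toolbox performs, here is a minimal contrastive-divergence (CD-1) update for a single binary RBM; it is a generic textbook sketch with assumed sizes and learning rate, not DeeBNet's object-oriented interface.

    nv = 784; nh = 100; lr = 0.05;                       % layer sizes and learning rate (assumptions)
    W = 0.01*randn(nv, nh); bv = zeros(1, nv); bh = zeros(1, nh);
    v0 = double(rand(20, nv) > 0.5);                     % toy batch of binary inputs (assumption)
    ph0 = 1 ./ (1 + exp(-(v0*W + bh)));                  % hidden probabilities given data
    h0  = double(rand(size(ph0)) < ph0);                 % sampled hidden states
    pv1 = 1 ./ (1 + exp(-(h0*W' + bv)));                 % reconstruction of the visibles
    ph1 = 1 ./ (1 + exp(-(pv1*W + bh)));                 % hidden probabilities given the reconstruction
    W  = W  + lr * (v0'*ph0 - pv1'*ph1) / size(v0, 1);   % CD-1 gradient step
    bv = bv + lr * mean(v0 - pv1, 1);
    bh = bh + lr * mean(ph0 - ph1, 1);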

Changes:

New in toolbox

  • Using GPU in Backpropagation
  • Revision of some demo scripts
  • Function approximation with multiple outputs
  • Feature extraction with GRBM in first layer



JMLR Information Theoretical Estimators 0.63

by szzoli - June 9, 2016, 23:42:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 315778 views, 59223 downloads, 0 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE additionally supports (i) combinations of the estimation techniques, (ii) easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.
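
A hedged usage sketch of ITE's estimator pattern (an *_initialization call that builds a cost object, then an *_estimation call on data stored one sample per column); the estimator name HShannon_kNN_k is quoted from the documentation as remembered and should be verified against the toolbox.

    Y  = randn(3, 2000);                        % 3-dimensional samples, one per column
    co = HShannon_kNN_k_initialization(1);      % initialize a kNN-based Shannon entropy estimator
    H  = HShannon_kNN_k_estimation(Y, co);      % estimate the differential entropy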

Changes:
  • Conditional Shannon entropy estimation: added.

  • Conditional Shannon mutual information estimation: included.


JMLR GPstuff 4.7

by avehtari - June 9, 2016, 17:45:15 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 89010 views, 20396 downloads, 0 subscriptions

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.
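
A hedged GPstuff regression sketch following the gp_set / gp_optim / gp_pred pattern from the toolbox documentation; the constructor names and options are from memory and should be checked against the GPstuff manual.

    x  = linspace(-3, 3, 30)'; y = sin(x) + 0.1*randn(size(x));   % toy data (assumption)
    xt = linspace(-3, 3, 200)';                                   % test inputs
    lik  = lik_gaussian('sigma2', 0.01);                          % Gaussian observation model
    gpcf = gpcf_sexp('lengthScale', 1, 'magnSigma2', 1);          % squared-exponential covariance
    gp   = gp_set('lik', lik, 'cf', gpcf);                        % assemble the GP structure
    gp   = gp_optim(gp, x, y);                                    % optimize hyperparameters
    [Eft, Varft] = gp_pred(gp, x, y, xt);                         % posterior mean and variance of the latent function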

Changes:

2016-06-09 Version 4.7

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo

Improvements

  • Improved use of PSIS
  • More options added to gp_monotonic
  • Monotonicity now works for additive covariance functions with selected variables
  • Possibility to use gpcf_squared.m-covariance function with derivative observations/monotonicity
  • Default behaviour made more robust by changing default jitter from 1e-9 to 1e-6
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016). Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models. JMLR, accepted for publication)
  • The 'selected variables' option now works better with monotonicity

Bugfixes

  • small error in derivative observation computation fixed
  • several minor bug fixes

MDLText 1

by renatoms88 - March 3, 2016, 19:31:25 CET [ BibTeX Download ] 4427 views, 1604 downloads, 0 subscriptions

About: testing mloss.org

Changes:

Initial Announcement on mloss.org.


Local high order regularization 1.0

by kkim - March 2, 2016, 13:46:17 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 7232 views, 1596 downloads, 0 subscriptions

Rating: 0/5 (based on 1 vote)

About: Local high-order regularization for semi-supervised learning

Changes:

Initial Announcement on mloss.org.

