Projects that are tagged with mutual information.


Information Theoretical Estimators 0.62

by szzoli - April 17, 2016, 17:19:00 CET [ Project Homepage | BibTeX | BibTeX for corresponding Paper | Download ] 96696 views, 18841 downloads, 2 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE additionally supports (i) combining estimation techniques, (ii) easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.
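
The kind of estimators ITE bundles can be illustrated with a small standalone sketch. ITE itself is a MATLAB/Octave toolbox, so the Python snippet below is not its API; it is a minimal sketch of a Kozachenko-Leonenko k-nearest-neighbour estimator of differential Shannon entropy, one of the estimator families ITE provides. The function name and the choice of k=3 are illustrative.

    # Illustrative sketch, not ITE's API: Kozachenko-Leonenko kNN estimator of
    # differential Shannon entropy (in nats) from an (N, d) sample matrix.
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def shannon_entropy_knn(samples, k=3):
        samples = np.asarray(samples, dtype=float)
        n, d = samples.shape
        # Distance from each point to its k-th nearest neighbour
        # (the query returns the point itself as the first "neighbour").
        dist, _ = cKDTree(samples).query(samples, k=k + 1)
        eps = dist[:, -1]
        # Log-volume of the d-dimensional Euclidean unit ball.
        log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
        return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

    # Sanity check: a standard 2-D Gaussian has entropy log(2*pi*e) ~ 2.84 nats.
    rng = np.random.default_rng(0)
    print(shannon_entropy_knn(rng.standard_normal((5000, 2)), k=3))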

Changes:
  • Von Mises expansion based estimators: included for 7 unconditional quantities (Shannon entropy, Shannon mutual information, Kullback-Leibler divergence, Rényi divergence, Tsallis divergence, Pearson Chi^2 divergence, Hellinger distance).

  • Analytical value (for Gaussian random variables) and quick test: added for the Hellinger distance.


FEAST 1.1.4

by apocock - March 12, 2016, 18:35:08 CET [ Project Homepage | BibTeX | BibTeX for corresponding Paper | Download ] 32264 views, 6298 downloads, 3 subscriptions

Rating: 5/5 (based on 2 votes)

About: FEAST provides implementations of common mutual information based filter feature selection algorithms (mim, mifs, mrmr, cmim, icap, jmi, disr, fcbf, etc.), and an implementation of RELIEF. Written for C/C++ and MATLAB.
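
As a rough illustration of what these filter methods do, the Python sketch below (not FEAST's C/MATLAB interface; the function names are hypothetical) implements a greedy mRMR-style selection with a plug-in mutual information estimate, assuming features and labels are encoded as non-negative integer numpy vectors.

    # Conceptual sketch, not FEAST's API: greedy mRMR-style feature selection
    # using plug-in mutual information on integer-coded discrete features.
    import numpy as np

    def mutual_information(x, y):
        # Plug-in MI (in nats) between two discrete non-negative integer vectors.
        joint = np.zeros((x.max() + 1, y.max() + 1))
        for xi, yi in zip(x, y):
            joint[xi, yi] += 1
        joint /= joint.sum()
        px, py = joint.sum(axis=1), joint.sum(axis=0)
        nz = joint > 0
        return float(np.sum(joint[nz] * np.log(joint[nz] / np.outer(px, py)[nz])))

    def mrmr(X, y, k):
        # Greedily pick k columns of X: maximise relevance MI(f; y) minus the
        # mean redundancy MI(f; s) over the already selected features s.
        n_feat = X.shape[1]
        relevance = [mutual_information(X[:, j], y) for j in range(n_feat)]
        selected = [int(np.argmax(relevance))]
        while len(selected) < k:
            scores = []
            for j in range(n_feat):
                if j in selected:
                    scores.append(-np.inf)
                    continue
                redundancy = np.mean([mutual_information(X[:, j], X[:, s])
                                      for s in selected])
                scores.append(relevance[j] - redundancy)
            selected.append(int(np.argmax(scores)))
        return selected

Each step adds the feature with the best relevance-minus-redundancy score; the other criteria FEAST implements differ mainly in how redundancy and complementarity between features are scored.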

Changes:
  • Fixed an issue where zero MI values would cause it to segfault.
  • Fixes to documentation and comments.
  • Updated internal version of MIToolbox.

MIToolbox 2.1.2

by apocock - January 10, 2016, 22:19:30 CET [ Project Homepage | BibTeX | BibTeX for corresponding Paper | Download ] 24539 views, 4314 downloads, 2 subscriptions

About: A mutual information library written in C, with MEX bindings for MATLAB. Aimed at feature selection, it provides simple methods to calculate mutual information, conditional mutual information, entropy, conditional entropy, Rényi entropy/mutual information, and weighted variants of Shannon entropies/mutual informations. Works with discrete distributions and expects column vectors of features.
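
For intuition, the discrete quantities such a toolbox computes can be written down directly from plug-in probability estimates. The Python sketch below is not MIToolbox's C/MATLAB API; it assumes integer-coded column vectors and base-2 logarithms, and mirrors the entropy, conditional entropy and mutual information calculations via the identity I(X;Y) = H(X) - H(X|Y).

    # Illustrative sketch, not MIToolbox's API: plug-in estimates of discrete
    # entropy, conditional entropy and mutual information (in bits).
    import numpy as np

    def entropy(x):
        # H(X) from empirical frequencies of a discrete 1-D vector.
        _, counts = np.unique(x, return_counts=True)
        p = counts / counts.sum()
        return float(-np.sum(p * np.log2(p)))

    def conditional_entropy(x, y):
        # H(X | Y) = H(X, Y) - H(Y), pairing the two vectors into joint labels.
        xy = np.stack([x, y], axis=1)
        _, joint_counts = np.unique(xy, axis=0, return_counts=True)
        p = joint_counts / joint_counts.sum()
        return float(-np.sum(p * np.log2(p))) - entropy(y)

    def mutual_information(x, y):
        # I(X; Y) = H(X) - H(X | Y).
        return entropy(x) - conditional_entropy(x, y)

    # Example with two short discrete feature vectors.
    x = np.array([0, 0, 1, 1, 2, 2])
    y = np.array([0, 0, 1, 1, 1, 1])
    print(entropy(x), conditional_entropy(x, y), mutual_information(x, y))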

Changes:

  • Relicensed as BSD.
  • Added checks to catch MATLAB inputs that aren't doubles.