Projects tagged with mutual information.


FEAST 1.1.1

by apocock - June 30, 2014, 01:30:23 CET. 14837 views, 3475 downloads, 1 subscription

Rating: 5/5 (based on 1 vote)

About: FEAST provides implementations of common mutual-information-based filter feature selection algorithms (MIM, MIFS, mRMR, CMIM, ICAP, JMI, DISR, FCBF, etc.), along with an implementation of RELIEF. Written for C/C++ and MATLAB.
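As a rough, library-independent sketch of the simplest criterion FEAST implements (MIM: score each discrete feature by its mutual information with the class labels and keep the top k), the snippet below is plain Python for illustration only, not FEAST's C or MATLAB API; the function names and toy data are invented here:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits for two discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def mim_rank(features, labels, k):
    """MIM criterion: rank features by I(feature; labels), keep the top k."""
    order = sorted(range(len(features)),
                   key=lambda i: mutual_information(features[i], labels),
                   reverse=True)
    return order[:k]

# Toy data (hypothetical): one perfect feature, one independent, one partial.
labels   = [0, 0, 1, 1, 0, 1, 0, 1]
features = [[0, 0, 1, 1, 0, 1, 0, 1],   # copies the label exactly
            [0, 1, 0, 1, 0, 1, 1, 0],   # independent of the label
            [0, 0, 1, 1, 1, 1, 0, 0]]   # partially informative
selected = mim_rank(features, labels, 2)  # -> [0, 2]
```

The real library replaces this brute-force scoring with optimised C routines and adds criteria (mRMR, JMI, etc.) that also penalise redundancy between the selected features.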

Changes:
  • Bug fixes to memory management.
  • Compatibility changes for the PyFeast Python wrapper (note: the C library now returns feature indices starting from 0; the MATLAB wrapper still returns indices starting from 1).
  • Added C version of MIM.
  • Updated internal version of MIToolbox.

MIToolbox 2.1

by apocock - June 30, 2014, 01:05:57 CET. 12682 views, 2420 downloads, 1 subscription

About: A mutual information library for C, with MEX bindings for MATLAB. Aimed at feature selection, it provides simple methods to calculate mutual information, conditional mutual information, entropy, conditional entropy, Rényi entropy/mutual information, and weighted variants of the Shannon entropy/mutual information. It works with discrete distributions and expects column vectors of features.
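The quantities listed above can be sketched as plug-in estimates over discrete column vectors; the Python below is a minimal illustration of the definitions, with invented function names, not MIToolbox's C or MEX interface:

```python
from collections import Counter
from math import log2

def entropy(*columns):
    """Joint Shannon entropy H(X1, ..., Xk) in bits, from discrete columns."""
    n = len(columns[0])
    return -sum(c / n * log2(c / n) for c in Counter(zip(*columns)).values())

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - entropy(x, y)

def cond_mutual_info(x, y, z):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)."""
    return entropy(x, z) + entropy(y, z) - entropy(z) - entropy(x, y, z)

# XOR example: X and Y look independent until you condition on Z.
x = [0, 0, 1, 1]
z = [0, 1, 0, 1]
y = [0, 1, 1, 0]          # y = x XOR z
mi  = mutual_info(x, y)        # -> 0.0 bits
cmi = cond_mutual_info(x, y, z)  # -> 1.0 bit
```

The conditional quantities are exactly what the filter criteria in FEAST are built from, which is why MIToolbox ships as its internal dependency.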

Changes:
  • Added weighted entropy functions.
  • Fixed a few memory handling bugs.


JMLR Information Theoretical Estimators 0.60

by szzoli - June 3, 2014, 00:17:33 CET. 45979 views, 9895 downloads, 2 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities, and kernels on distributions. Thanks to its highly modular design, ITE additionally supports (i) combinations of the estimation techniques, (ii) easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.
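The two divergences touched by this release have simple closed forms when the discrete distributions are known. The plain-Python sketch below (invented names, not ITE's estimator interface) illustrates them, using the identity that the Tsallis divergence at alpha = 2 coincides with the Pearson chi-square divergence:

```python
def tsallis_divergence(p, q, alpha):
    """Tsallis divergence D_a(p||q) = (sum_i p_i^a * q_i^(1-a) - 1) / (a - 1)."""
    return (sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q)) - 1) / (alpha - 1)

def pearson_chi2(p, q):
    """Pearson chi-square divergence: sum_i (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

# Hypothetical distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.25, 0.25, 0.5]
d_tsallis = tsallis_divergence(p, q, 2.0)
d_chi2    = pearson_chi2(p, q)   # equals d_tsallis when alpha = 2
```

ITE's contribution is estimating such quantities from samples (e.g. via MLE in an exponential family plus the analytical formula, as in the changelog below), rather than from known probability vectors.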

Changes:
  • Introduced a quick test for the Tsallis divergence.
  • Added Pearson chi-square divergence estimation in the exponential family (MLE + analytical formula).