Project details for Information Theoretical Estimators

Information Theoretical Estimators 0.42

by szzoli - September 7, 2013, 16:46:42 CET

Description:

ITE can estimate the following quantities (a minimal usage sketch follows the list):

  • entropy: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy,

  • mutual information: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information (total correlation, multi-information), L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwarz quadratic mutual information, Euclidean distance-based quadratic mutual information, distance covariance, distance correlation, approximate correntropy independence measure,

  • divergence: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance, an integral probability metric), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwarz divergence, Euclidean distance-based divergence, energy distance (specifically the Cramér-von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, certain f-divergences (Csiszár-Morimoto divergence, Ali-Silvey distance), non-symmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance,

  • association measures, including measures of concordance: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy-induced metric, centered correntropy-induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,

  • cross quantities: cross-entropy,

  • kernels on distributions: expected kernel, Bhattacharyya kernel, probability product kernel, Jensen-Shannon kernel.
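
All estimators follow the same initialization/estimation pattern. Below is a minimal sketch in Matlab/Octave for Shannon entropy, assuming the high-level H_initialization/H_estimation interface and the 'Shannon_kNN_k' cost name from the ITE documentation; samples are stored one per column.

    % Sketch only: H_initialization / H_estimation and the 'Shannon_kNN_k'
    % cost name are taken from the ITE documentation.
    d = 3; T = 5000;
    Y = randn(d, T);                  % d-dimensional standard normal, one sample per column

    mult = 1;                         % keep multiplicative constants in the estimate
    co = H_initialization('Shannon_kNN_k', mult);   % cost object of the estimator
    H  = H_estimation(Y, co);         % estimated Shannon entropy

    % Analytical value for a d-dimensional standard normal: H = d/2 * log(2*pi*e);
    % the two numbers should be close for large T.
    H_true = d/2 * log(2*pi*exp(1));
    disp([H, H_true]);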

ITE offers solution methods for

  • Independent Subspace Analysis (ISA) and
  • its extensions to linear, controlled, post-nonlinear, complex-valued, and partially observed models, as well as to systems with nonparametric source dynamics.

ITE is

  • written in Matlab/Octave,
  • multi-platform (tested extensively on Windows and Linux),
  • free and open source (released under the GNU GPL v3 or later license).
Changes since the previous version:
  • High-level information theoretical estimators: 'eval' calls replaced by function handles, which speeds up computation.

  • Cost object initialization now allows setting field values (alpha, number of k-nearest neighbors, ...) through its argument. This makes it possible to override default values, with automatic inheritance in meta estimators; see the sketch after this list.

  • Quick tests introduced: consistency of the estimators (analytical vs. estimated values), positive semi-definiteness of the Gram matrices determined by distribution kernels, and image registration.

  • Code refactored; documentation improved.
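
As an illustration of the new cost object initialization, here is a hedged sketch. It assumes the extra initialization argument is a cell array of field name-value pairs, and that the Rényi entropy estimator 'Renyi_kNN_k' exposes 'alpha' and 'k' fields (names inferred from the changelog; consult the ITE documentation for the exact ones).

    % Sketch only: field names 'alpha' and 'k' are assumptions based on the
    % changelog; the override mechanism is the new initialization argument.
    mult = 1;
    co_default = H_initialization('Renyi_kNN_k', mult);                      % default alpha and k
    co_custom  = H_initialization('Renyi_kNN_k', mult, {'alpha',0.9,'k',5}); % overridden values

    Y  = randn(2, 2000);                  % 2-dimensional sample, one sample per column
    H1 = H_estimation(Y, co_default);     % Rényi entropy with default field values
    H2 = H_estimation(Y, co_custom);      % Rényi entropy with overridden fields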

Supported Operating Systems: Linux, Windows
Data Formats: Matlab, Octave
Tags: Entropy, Mutual Information, Divergence, Independent Subspace Analysis, Separation Principles, Independent Process Analysis, Association Measure, Measure Of Concordance, Measure Of Independence, Nonpa