Project details for Information Theoretical Estimators

Information Theoretical Estimators 0.14

by szzoli - October 29, 2012, 12:31:34 CET


Description:

ITE can estimate

  • entropy: Shannon, Rényi, and Tsallis entropy;
  • mutual information and other dependence measures: generalized variance, kernel canonical correlation analysis, kernel generalized variance, the Hilbert-Schmidt independence criterion, Shannon, L2, Rényi, and Tsallis mutual information, copula-based kernel dependency, and the multivariate version of Hoeffding's Phi;
  • complex variants of entropy and mutual information;
  • divergence measures: the L2, Rényi, and Tsallis divergences, maximum mean discrepancy, and the J-distance.
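
For orientation, here is a minimal Octave/Matlab sketch of a Kozachenko-Leonenko type k-nearest-neighbor Shannon entropy estimator, i.e. the kind of nonparametric estimator the toolbox provides. The function name, the one-sample-per-column convention, and the brute-force distance computation are illustrative choices for this sketch and are not ITE's own API:

    % Kozachenko-Leonenko type k-NN Shannon entropy estimator (in nats).
    % Illustrative sketch only, not part of ITE.
    function H = knn_shannon_entropy(Y, k)
      % Y: d x N sample matrix (one sample per column); k: neighbor order.
      [d, N] = size(Y);
      sq = sum(Y.^2, 1);                                   % squared norms, 1 x N
      D2 = max(bsxfun(@plus, sq', sq) - 2 * (Y' * Y), 0);  % pairwise squared distances
      D2(1:N+1:end) = Inf;                                 % exclude self-distances
      D2 = sort(D2, 2);                                    % row i: sorted squared distances from sample i
      rho = sqrt(D2(:, k));                                % distance to the k-th nearest neighbor
      Vd = pi^(d/2) / gamma(d/2 + 1);                      % volume of the d-dimensional unit ball
      H  = psi(N) - psi(k) + log(Vd) + d * mean(log(rho));
    end

As a quick sanity check, for Y = randn(1, 10000) and k = 3 the estimate should be close to the true Gaussian value 0.5*log(2*pi*e), roughly 1.42 nats.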

ITE offers solution methods for

  • Independent Subspace Analysis (ISA) and
  • its extensions to linear, controlled, post-nonlinear, complex-valued, and partially observed models, as well as to systems with nonparametric source dynamics (a toy instance of the ISA model is sketched after this list).
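
As an illustration of the problem ITE addresses, the following Octave/Matlab snippet generates a toy data set from the basic ISA model x = A*e, in which the hidden source e stacks independent multidimensional components and only the mixture x is observed; the task is to recover the hidden subspaces. The subspace number, dimensions, source distribution, and mixing below are arbitrary choices for this sketch, not anything shipped with ITE:

    % Toy ISA data: M independent 2-D source components, mixed linearly.
    % Illustrative sketch only, not part of ITE.
    M = 3; ds = 2; N = 5000;                   % number of subspaces, subspace dimension, sample size
    e = zeros(M * ds, N);
    for m = 1:M
      phi = 2 * pi * rand(1, N);               % each component: uniform on the unit circle
      e((m-1)*ds + (1:ds), :) = [cos(phi); sin(phi)];  % dependent within, independent across subspaces
    end
    A = orth(randn(M * ds));                   % random orthogonal mixing matrix
    x = A * e;                                 % observed mixture; ISA recovers e up to per-subspace transforms and permutation

A commonly used separation principle reduces ISA to ICA followed by grouping the estimated one-dimensional components by their pairwise dependence; this is the kind of result the "Separation Principles" tag below refers to.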

ITE is

  • written in Matlab/Octave,
  • multi-platform (tested extensively on Windows and Linux),
  • free and open source (released under the GNU GPL v3 or any later version).
Changes to previous version:
  • Added a Monte-Carlo simulation to compute the additive constants in Rényi entropy estimation.
  • Performed miscellaneous speed-ups and compatibility enhancements.
Supported Operating Systems: Linux, Windows
Data Formats: Matlab, Octave
Tags: Entropy, Mutual Information, Divergence, Independent Subspace Analysis, Separation Principles, Independent Process Analysis
