Project details for Information Theoretical Estimators


by szzoli - June 9, 2016, 23:42:14 CET


Description:

ITE can estimate

  • entropy (a minimal kNN-based sketch for Shannon entropy follows this list): Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy, Phi-entropy (f-entropy), Sharma-Mittal entropy,

  • mutual information: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information (total correlation, multi-information), L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwarz quadratic mutual information, Euclidean distance based quadratic mutual information, distance covariance, distance correlation, approximate correntropy independence measure, chi-square mutual information (Hilbert-Schmidt norm of the normalized cross-covariance operator, squared-loss mutual information, mean square contingency),

  • divergence: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance, an integral probability metric), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwarz divergence, Euclidean distance based divergence, energy distance (in particular, the Cramér-von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, f-divergence (Csiszár-Morimoto divergence, Ali-Silvey distance), non-symmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance, Pearson chi-square divergence (chi-square distance), Sharma-Mittal divergence,

  • association measures, including measures of concordance: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy induced metric, centered correntropy induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,

  • cross quantities: cross-entropy,

  • kernels on distributions: expected kernel (summation kernel, mean map kernel, set kernel, multi-instance kernel, ensemble kernel; special convolution kernel), Bhattacharyya kernel (Bhattacharyya coefficient, Hellinger affinity), probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, Jensen-Tsallis kernel, exponentiated Jensen-Rényi kernel(s), exponentiated Jensen-Tsallis kernel(s),

  • conditional entropy: conditional Shannon entropy,

  • conditional mutual information: conditional Shannon mutual information.
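
For intuition, many of the entropy estimators above are k-nearest-neighbor based. The following minimal Matlab/Octave sketch of the Kozachenko-Leonenko kNN Shannon entropy estimator illustrates that estimator family; it is an independent sketch, not ITE's own interface. The function name kl_entropy and the d x N, columns-as-samples convention are assumptions made for this example, and the samples are assumed to be distinct.

    function H = kl_entropy(Y, k)
      % Kozachenko-Leonenko kNN Shannon entropy estimator (in nats).
      % Y: d x N sample matrix (one observation per column); k: neighbor index.
      [d, N] = size(Y);
      sq = sum(Y.^2, 1);                          % squared norms, 1 x N
      D2 = bsxfun(@plus, sq', sq) - 2*(Y'*Y);     % squared pairwise distances
      D2(1:N+1:end) = Inf;                        % exclude self-distances
      S  = sort(D2, 2);                           % row i: sorted distances from sample i
      eps_k = sqrt(max(S(:, k), 0));              % distance to the k-th nearest neighbor
      log_Vd = (d/2)*log(pi) - gammaln(d/2 + 1);  % log-volume of the unit d-ball
      H = psi(N) - psi(k) + log_Vd + (d/N)*sum(log(eps_k));  % psi: digamma
    end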

ITE offers

  • solvers for (i) Independent Subspace Analysis (ISA) and (ii) its extensions to linear, controlled, post-nonlinear, complex-valued, and partially observed models, as well as to systems with nonparametric source dynamics,

  • several consistency tests (analytical vs. estimated value; see the sketch after this list),

  • illustrations for information theoretical image registration, and

  • distribution regression with applications in (i) supervised entropy learning and (ii) aerosol optical depth prediction based on satellite images.
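
As a concrete instance of such a consistency test, the sketch below compares the closed-form Shannon entropy of a Gaussian, H = 0.5*log((2*pi*e)^d * det(C)), with the kNN estimate produced by the hypothetical kl_entropy sketch given earlier. The dimension, sample size, and k are illustrative choices, not ITE defaults.

    % Consistency check: analytical vs. estimated Shannon entropy of a Gaussian.
    d = 2; N = 5000; k = 3;
    A = randn(d); C = A*A' + d*eye(d);            % a random positive definite covariance
    Y = chol(C, 'lower') * randn(d, N);           % zero-mean Gaussian sample, d x N
    H_true = 0.5*log((2*pi*exp(1))^d * det(C));   % closed-form Gaussian entropy (nats)
    H_est  = kl_entropy(Y, k);                    % kNN estimate (sketch above)
    fprintf('analytical: %.4f   estimated: %.4f\n', H_true, H_est);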

ITE is

  • written in Matlab/Octave,
  • multi-platform (tested extensively on Windows and Linux),
  • free and open source (released under the GNU GPL license, version 3 or later).

Note: a Python implementation is also available (https://bitbucket.org/szzoli/ite-in-python/).

ITE mailing list: https://groups.google.com/d/forum/itetoolbox

Follow ITE:

  • on BitBucket: Python (https://bitbucket.org/szzoli/ite-in-python/follow), Matlab/Octave (https://bitbucket.org/szzoli/ite/follow),
  • on Twitter (https://twitter.com/ITEtoolbox).

Share your ITE application: https://bitbucket.org/szzoli/ite/wiki

Changes to previous version:
  • Conditional Shannon entropy estimation: added.

  • Conditional Shannon mutual information estimation: included.

Supported Operating Systems: Linux, Windows
Data Formats: Matlab, Octave
Tags: Conditional Entropy, Conditional Mutual Information, Entropy, Mutual Information, Divergence, Independent Subspace Analysis, Separation Principles, Independent Process Analysis, Association Measure, Measure Of Concordance, Measure Of Independence, Nonparametric Estimation, Distribution Regression

Other available revisions

Version Changelog Date
0.63
  • Conditional Shannon entropy estimation: added.

  • Conditional Shannon mutual information estimation: included.

June 9, 2016, 23:42:14
0.62
  • Von Mises expansion based estimators: included for 7 unconditional quantities (Shannon entropy, Shannon mutual information, Kullback-Leibler divergence, Rényi divergence, Tsallis divergence, Pearson chi-square divergence, Hellinger distance).

  • Analytical value (for Gaussian random variables) and quick test: added for the Hellinger distance.

April 17, 2016, 17:19:00
0.61
  • Explicit additive constant computation in generalized kNN based Rényi entropy estimators: an enhancement suggestion has been added.

  • Analytical value computation of the exponentiated Jensen-Rényi kernel-2: simplified.

February 8, 2015, 14:04:27
0.60
  • Quick test on the Tsallis divergence: introduced.

  • Pearson chi-square divergence estimation in the exponential family (MLE + analytical formula): added.

June 3, 2014, 00:17:33
0.59
  • Adaptive partitioning based Shannon mutual information estimators: added.

  • Quick tests: updated to handle the new estimators.

May 16, 2014, 22:13:48
0.58
  • 3-way interaction indices based on the embedding of the (i) Lancaster interaction and (ii) 'joint - product of the marginals' signed measures into an RKHS: added.

  • Quick tests: updated to cover the new estimators.

April 29, 2014, 22:08:50
0.57
  • Kullback-Leibler divergence estimation based on maximum likelihood estimation + analytical formula in the chosen exponential family: added.

  • A new sampling based entropy estimator with KDE correction on the left/right sides: added.

  • Quick tests: updated with the new estimators.

April 10, 2014, 18:35:22
0.56
  • Distribution regression (supervised entropy learning, aerosol optical depth prediction based on satellite images): added.

  • MMD distance computation based on U-statistics, expected kernel: upgraded to cover new kernels (exponential, Cauchy, Matern, polynomial, rational quadratic, inverse multiquadratic).

March 27, 2014, 22:06:54
0.55
  • Shannon entropy and cross-entropy estimation based on maximum likelihood estimation + analytical formula in the chosen exponential family: added.

  • Quick tests: updated with the new estimators.

March 8, 2014, 00:18:00
0.54
  • Rényi and Tsallis entropy estimation based on maximum likelihood estimation + analytical formula in the exponential family: added.

  • Quick tests: updated according to the new estimators.

February 24, 2014, 18:28:32
0.53
  • f-divergence estimation based on second-order Taylor expansion + Pearson chi-square divergence: added.

  • Shannon mutual information estimation based on KL divergence: added.

  • Quick tests: updated with the new estimators.

February 2, 2014, 14:04:19
0.52
  • Sharma-Mittal divergence estimation: added using (i) maximum likelihood estimation + analytical formula in the exponential family, (ii) k-nearest neighbors.

  • Quick test for (i) Sharma-Mittal divergence, (ii) Shannon mutual information: added.

  • Normal variables: added to the Pearson chi square divergence quick test.

January 10, 2014, 00:02:47
0.51
  • ITE has been accepted for publication in JMLR; citing information: added.

  • Block-MMD (maximum mean discrepancy) estimator: added.

  • 'Extremely large' k in kNN-based estimators: an overflow issue was discovered and corrected.

  • Some refactoring; documentation upgrade.

December 29, 2013, 17:36:00
0.50
  • Entropy and Kullback-Leibler divergence estimation based on power spectral density representation and Szegő's theorem: added.

  • Different noisy examples have been added to the image registration quick test.

December 18, 2013, 20:53:28
0.49
  • MMD (maximum mean discrepancy) estimation based on U- and V-statistics: incomplete Cholesky decomposition based accelerations added.

  • Refactoring; improved navigation in the documentation.

December 1, 2013, 16:44:15
0.48

Sharma-Mittal entropy estimation based on

  • k-nearest neighbors (S={k}): added.

  • maximum likelihood estimation + analytical value in the exponential family: added.

November 11, 2013, 19:26:39
0.47
  • Chi-square mutual information estimation based on Pearson chi-square divergence: added.

  • Shannon entropy estimation based on an alternative linearly corrected spacing method: added.

November 1, 2013, 19:29:12
0.46
  • Phi-entropy (f-entropy) estimation based on the spacing method: added.

  • Pearson chi-square divergence (chi-square distance) estimation based on k-nearest neighbors: added.

October 21, 2013, 19:31:52
0.45
  • Exponentiated Jensen-Tsallis kernel-1 estimation based on Tsallis entropy: added.

  • Exponentiated Jensen-Tsallis kernel-2 estimation based on Jensen-Tsallis divergence: added.

October 9, 2013, 21:44:21
0.44
  • Exponentiated Jensen-Rényi kernel-1 estimation based on Rényi entropy: added.

  • Exponentiated Jensen-Rényi kernel-2 estimation based on Jensen-Rényi divergence: added.

October 1, 2013, 18:16:38
0.43
  • Exponentiated Jensen-Shannon kernel estimation: added.

  • Jensen-Tsallis kernel estimation: added.

September 20, 2013, 19:56:45
0.42
  • High-level information theoretical estimators: 'eval' changed to 'function handles', which speeds up computations.

  • Cost object initialization: now allows setting field values (alpha, number of kNNs, ...) through its argument, which makes it possible to override default values and enables automatic inheritance in meta-estimators.

  • Quick tests introduced: consistency of the estimators (analytical vs. estimated value), positive semi-definiteness of Gram matrices determined by distribution kernels, image registration.

  • Refactoring; documentation improved.

September 7, 2013, 16:46:42
0.41
  • Probability product kernel estimation based on k-nearest neighbors: added,

  • Jensen-Shannon kernel estimation: added.

July 12, 2013, 21:33:42
0.40
  • Bhattacharyya kernel estimation based on k-nearest neighbors: added,

  • Expected kernel estimation: added,

  • Kernel on distributions (K) object type: added.

June 23, 2013, 13:13:27
0.39
  • Symmetric Bregman distance estimation based on nonsymmetric Bregman distance: added,

  • Symmetric Bregman distance estimation using the k-nearest neighbor method: added.

June 12, 2013, 13:12:52
0.38
  • Jensen-Tsallis divergence estimation: added,

  • Bregman distance estimation: added.

June 1, 2013, 10:20:00
0.37
  • K divergence estimation: added,

  • L divergence estimation: added,

  • kNN squared distance computation: refined.

May 12, 2013, 15:35:39
0.36
  • Jensen-Rényi divergence estimation: added,

  • Jensen-Shannon divergence estimation: added.

April 26, 2013, 18:45:27
0.35

An alternative Jacobi optimization based ICA solution with general entropy/mutual information estimators: added; the method extends the RADICAL ICA scheme to general objectives.

April 2, 2013, 10:37:51
0.34

Jacobi optimization based ICA solution with general entropy/mutual information estimators: added. The method extends the SWICA scheme to general objectives.

March 22, 2013, 11:41:05
0.33

Two one-dimensional Shannon entropy estimators based on the maximum entropy method: added.

March 6, 2013, 09:44:47
0.32
  • ICA and ISA structures: introduced for unified treatment of the estimators. It will also enable embedding of general ICA optimization algorithms such as the Jacobi method.

  • 'stepwiseLS' mAR estimator: deleted.

  • 'kdpee.c': MSVC does not provide log2; a more elegant solution: added.

February 25, 2013, 12:42:40
0.31
  • EASI (equivariant adaptive separation via independence) real/complex ICA method: added.

  • Adaptive (k-d) partitioning based Shannon entropy estimation: added.

February 9, 2013, 11:31:39
0.30
  • Upper tail dependence via conditional Spearman's rho: added.

  • Multivariate conditional version of Spearman's rho weighting the upper tail: added.

January 25, 2013, 15:11:25
0.29
  • Lower tail dependence via conditional Spearman's rho: added.

  • Multivariate conditional version of Spearman's rho weighting the lower tail: added.

January 13, 2013, 11:38:17
0.28
  • Multivariate extension of Blomqvist's beta (medial correlation coefficient): added.

  • Average pairwise Spearman's rho: added.

January 2, 2013, 22:51:22
0.27
  • Approximate correntropy independence measure estimator: added.

  • Correntropy induced metric, centered correntropy induced metric estimators: added.

  • Correntropy, centered correntropy, correntropy coefficient estimators: added.

  • Handling of identically constant random variables in distance correlation computation: included.

December 28, 2012, 14:41:57
0.26
  • Distance covariance estimation via HSIC (Hilbert-Schmidt independence criterion): added.

  • Energy distance estimation via MMD (maximum mean discrepancy): added.

  • Energy distance estimation: added.

  • Previously the square of the distance correlation was computed; the missing sqrt is now included.

December 22, 2012, 14:12:33
0.25

Distance covariance, distance correlation estimation: added.

December 15, 2012, 10:20:06
0.24

MMD (maximum mean discrepancy) estimation based on U- and V-statistics: added.

December 12, 2012, 12:16:05
0.23
  • Three multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient): added.

  • Association (A) cost object type: added.

December 7, 2012, 18:33:26
0.22
  • Cauchy-Schwarz and Euclidean distance based divergence estimators: added.

  • Cauchy-Schwarz and Euclidean distance based quadratic mutual information estimators: added.

December 1, 2012, 13:33:31
0.21
  • Kullback-Leibler divergence estimator based on cross-entropy and entropy: added.

  • Cross-entropy estimation based on k-nearest neighbors: added.

  • Cross cost object type: added.

November 25, 2012, 20:56:32
0.20
  • Two Shannon entropy estimators based on the distance (KL divergence) from the uniform/Gaussian distributions: added.

  • Shannon entropy estimator based on Voronoi regions: added.

November 21, 2012, 13:55:09
0.19
  • Two k-nearest neighbor based Kullback-Leibler divergence estimators: added.

  • compute_CDSS.cpp: 'sqrt(T)' -> 'sqrt(double(T))', to increase compatibility with compilers.

November 21, 2012, 13:47:05
0.18

Eight sample-spacing based 1d Shannon/Rényi entropy estimators: added.

November 10, 2012, 12:53:37
0.17
  • Edgeworth expansion based Shannon entropy estimator: accelerated (C++ alternative).
  • 'Tsallis entropy <- Rényi entropy' meta estimator: added.
November 6, 2012, 22:18:43
0.16
  • Edgeworth expansion based Shannon entropy estimator: added.
  • Lookup table for the underlying H/I/D estimation formulas: added (documentation).
November 2, 2012, 16:03:25
0.15
  • The Hellinger and Bhattacharyya distances are now available in ITE. They can be estimated via k-nearest neighbor methods.

  • A '/'->'*' typo: corrected in 'DL2_kNN_k_estimation.m'.

October 29, 2012, 17:27:28
0.14
  • Monte Carlo simulation to compute the additive constants in Rényi entropy estimation: added.
  • Some accelerations/compatibility enhancements: performed.
October 29, 2012, 12:31:34
0.13
  • Tsallis entropy is now available in ITE. It can be estimated via k-nearest neighbors.
  • A '/'->'*' typo: corrected in 'HRenyi_kNN_k_estimation.m'.
October 27, 2012, 22:14:18
0.12
  • Schweizer-Wolff's sigma and kappa: added.
  • Hoeffding's Phi computation: scaled-up.
October 27, 2012, 21:21:27
0.11

Multivariate version of Hoeffding's Phi: added.

October 20, 2012, 23:30:17
0.1

Initial Announcement on mloss.org.

October 11, 2012, 07:47:43
