Information Theoretical Estimators — http://mloss.org
Updates and additions to Information Theoretical Estimators
Thu, 09 Jun 2016 23:42:14 -0000
Information Theoretical Estimators 0.63
<html>
<p>ITE can estimate</p>
<ul>
<li><p><code>entropy</code>: Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), complex entropy, Phi-entropy (f-entropy), Sharma-Mittal entropy,</p></li>
<li><p><code>mutual information</code>: generalized variance, kernel canonical correlation analysis, kernel generalized variance, Hilbert-Schmidt independence criterion, Shannon mutual information (total correlation, multi-information), L2 mutual information, Rényi mutual information, Tsallis mutual information, copula-based kernel dependency, multivariate version of Hoeffding's Phi, Schweizer-Wolff's sigma and kappa, complex mutual information, Cauchy-Schwarz quadratic mutual information, Euclidean distance based quadratic mutual information, distance covariance, distance correlation, approximate correntropy independence measure, chi-square mutual information (Hilbert-Schmidt norm of the normalized cross-covariance operator, squared-loss mutual information, mean square contingency),</p></li>
<li><p><code>divergence</code>: Kullback-Leibler divergence (relative entropy, I directed divergence), L2 divergence, Rényi divergence, Tsallis divergence, Hellinger distance, Bhattacharyya distance, maximum mean discrepancy (kernel distance, an integral probability metric), J-distance (symmetrised Kullback-Leibler divergence, J divergence), Cauchy-Schwarz divergence, Euclidean distance based divergence, energy distance (specifically the Cramér-von Mises distance), Jensen-Shannon divergence, Jensen-Rényi divergence, K divergence, L divergence, f-divergence (Csiszár-Morimoto divergence, Ali-Silvey distance), non-symmetric Bregman distance (Bregman divergence), Jensen-Tsallis divergence, symmetric Bregman distance, Pearson chi-square divergence (chi-square distance), Sharma-Mittal
divergence,</p></li>
<li><p><code>association measures</code>, including <code>measures of concordance</code>: multivariate extensions of Spearman's rho (Spearman's rank correlation coefficient, grade correlation coefficient), correntropy, centered correntropy, correntropy coefficient, correntropy induced metric, centered correntropy induced metric, multivariate extension of Blomqvist's beta (medial correlation coefficient), multivariate conditional version of Spearman's rho, lower/upper tail dependence via conditional Spearman's rho,</p></li>
<li><p><code>cross quantities</code>: cross-entropy,</p></li>
<li><p><code>kernels on distributions</code>: expected kernel (summation kernel, mean map kernel, set kernel, multi-instance kernel, ensemble kernel; special convolution kernel), Bhattacharyya kernel (Bhattacharyya coefficient, Hellinger affinity), probability product kernel, Jensen-Shannon kernel, exponentiated Jensen-Shannon kernel, Jensen-Tsallis kernel, exponentiated Jensen-Rényi kernel(s), exponentiated Jensen-Tsallis kernel(s),</p></li>
<li><p><code>conditional entropy</code>: conditional Shannon entropy,</p></li>
<li><p><code>conditional mutual information</code>: conditional Shannon mutual information.</p></li>
</ul>
<p>ITE offers</p>
<ul>
<li><p>solvers for (i) Independent Subspace Analysis (ISA) and (ii) its extensions to linear, controlled, post-nonlinear, complex-valued, and partially observed models, as well as to systems with nonparametric source dynamics,</p></li>
<li><p>several consistency tests (analytical vs. estimated value),</p></li>
<li><p>illustrations for information theoretical image registration, and</p></li>
<li><p>distribution regression with applications in (i) supervised entropy learning and (ii) aerosol optical depth prediction based on satellite images.
</p></li>
</ul>
<p>ITE is</p>
<ul>
<li>written in Matlab/Octave,</li>
<li>multi-platform (tested extensively on Windows and Linux),</li>
<li>free and open source (released under the GNU GPLv3 (or later) license).</li>
</ul>
<p>Note (new): a Python implementation is also available.</p>
<p>ITE mailing list: "".</p>
<p>Follow ITE:</p>
<ul>
<li>on BitBucket: Python, Matlab/Octave,</li>
<li>on Twitter.</li>
</ul>
<p>Share your ITE application: "".</p>
</html>
Zoltan Szabo
Tags: entropy, conditional mutual information, mutual information, divergence, independent subspace analysis, separation principles, independent process analysis, association measure, measure of concordance, measure of independence, nonparametric estimation, distribution regression
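The estimators above are predominantly nonparametric, i.e. computed directly from samples without fitting a density. As an illustration of the flavour of estimator involved — a generic Python sketch, not ITE's Matlab/Octave or Python API — here is the classical Kozachenko-Leonenko k-nearest-neighbour estimator of Shannon (differential) entropy, which underlies many kNN-based methods of this kind:

```python
import math
import numpy as np

def knn_shannon_entropy(x, k=3):
    """Kozachenko-Leonenko kNN estimate of Shannon differential entropy.

    x: (n, d) array of i.i.d. samples; k: number of nearest neighbours.
    Returns an estimate in nats.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Pairwise Euclidean distances (brute force; fine for moderate n).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)          # exclude each point itself
    eps = np.sort(dist, axis=1)[:, k - 1]   # distance to k-th neighbour
    # psi(n) - psi(k) for integer arguments reduces to a harmonic sum.
    psi_diff = sum(1.0 / j for j in range(k, n))
    # Log-volume of the d-dimensional unit ball.
    log_cd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    return psi_diff + log_cd + d * float(np.mean(np.log(eps)))

# Sanity check against the analytic entropy of a 1-D Gaussian.
rng = np.random.default_rng(0)
sigma = 2.0
sample = rng.normal(0.0, sigma, size=(2000, 1))
h_hat = knn_shannon_entropy(sample, k=3)
h_true = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
print(f"estimate: {h_hat:.3f}  analytic: {h_true:.3f}")
```

The estimate converges to the analytic value as the sample size grows; a consistency test of exactly this shape (estimated vs. analytical value on a distribution with closed-form entropy) is what the toolbox's test suite automates across its estimators.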