Non-Parametric Estimation of Mutual Information through the Entropy of the Linkage
A new, non-parametric and binless estimator for the mutual information of a d-dimensional random vector is proposed. First of all, an equation that links the mutual information to the entropy of a suitable random vector with uniformly distributed components is deduced. When d = 2 this equation reduc...
Main Authors: Maria Teresa Giraudo, Laura Sacerdote, Roberta Sirovich
Format: Article
Language: English
Published: MDPI AG, 2013-11-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/15/12/5154
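The relation described in the abstract (the mutual information of a random vector equals the negative differential entropy of its "linkage", which for d = 2 is the copula) can be sketched numerically. The following is a minimal illustration, not the authors' estimator: it assumes a rank transform to obtain the empirical copula and a Kozachenko-Leonenko k-nearest-neighbour entropy estimate; all function names are illustrative.

```python
import numpy as np

def digamma(n):
    """Digamma at a positive integer: psi(n) = -gamma + sum_{m=1}^{n-1} 1/m."""
    euler_gamma = 0.5772156649015329
    return -euler_gamma + np.sum(1.0 / np.arange(1, n))

def knn_entropy(samples, k=5):
    """Kozachenko-Leonenko differential entropy estimate in nats (max-norm)."""
    n, d = samples.shape
    # Pairwise max-norm distances (brute force; fine for moderate n).
    dist = np.abs(samples[:, None, :] - samples[None, :, :]).max(axis=2)
    np.fill_diagonal(dist, np.inf)
    eps = np.sort(dist, axis=1)[:, k - 1]  # distance to the k-th neighbour
    return (-digamma(k) + digamma(n) + d * np.log(2.0)
            + d * np.mean(np.log(eps)))

def mutual_information(x, y, k=5):
    """MI sketch: entropy of the empirical copula, negated (d = 2 linkage)."""
    n = len(x)
    # Rank transform each margin to (0, 1): the empirical copula sample.
    u = (np.argsort(np.argsort(x)) + 1) / (n + 1.0)
    v = (np.argsort(np.argsort(y)) + 1) / (n + 1.0)
    return -knn_entropy(np.column_stack([u, v]), k=k)

# Illustrative check on correlated Gaussians, where the true MI is
# -0.5 * log(1 - rho^2) nats; the estimate should land in that vicinity.
rng = np.random.default_rng(0)
n, rho = 1000, 0.8
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
mi = mutual_information(x, y, k=5)
true_mi = -0.5 * np.log(1.0 - rho**2)
```

Because the marginals are mapped to uniforms first, no density estimation of the margins is needed, which is what makes a copula/linkage-based estimator binless.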
Similar Items
- Information Entropy Suggests Stronger Nonlinear Associations between Hydro-Meteorological Variables and ENSO
  by: Tue M. Vu, et al.
  Published: (2018-01-01)
- Nonparametric estimation of the measure of functional dependence
  by: Qingsong Shan, et al.
  Published: (2021-09-01)
- Superpixel Segmentation of Hyperspectral Images Based on Entropy and Mutual Information
  by: Lianlei Lin, et al.
  Published: (2020-02-01)
- Empirical Estimation of Information Measures: A Literature Guide
  by: Sergio Verdú
  Published: (2019-07-01)
- Minimum Mutual Information and Non-Gaussianity through the Maximum Entropy Method: Estimation from Finite Samples
  by: Carlos A. L. Pires, et al.
  Published: (2013-02-01)