Quadratic Mutual Information Feature Selection
We propose a novel feature selection method based on quadratic mutual information, which has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method directly estimates quadratic mutual information from data samples using Gaussian kernel functions and can detect second order n...
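The abstract describes estimating quadratic mutual information (QMI) directly from samples with Gaussian kernels. A minimal sketch of such a Cauchy–Schwarz QMI estimator between one feature and a target is given below; the function name `cs_qmi` and the default kernel width `sigma` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_gram(v, sigma):
    # Pairwise Gaussian kernel matrix: G[i, j] = exp(-(v_i - v_j)^2 / (2 sigma^2))
    d = v[:, None] - v[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def cs_qmi(x, y, sigma=1.0):
    """Sketch of Cauchy-Schwarz quadratic mutual information between two
    1-D samples, estimated with Gaussian kernels (sigma is a hypothetical
    default; in practice it would be tuned or set by a bandwidth rule)."""
    gx = gaussian_gram(np.asarray(x, dtype=float), sigma)
    gy = gaussian_gram(np.asarray(y, dtype=float), sigma)
    v_joint = np.mean(gx * gy)                             # joint information potential
    v_marg = np.mean(gx) * np.mean(gy)                     # product of marginal potentials
    v_cross = np.mean(gx.mean(axis=1) * gy.mean(axis=1))   # cross information potential
    # Non-negative by the Cauchy-Schwarz inequality; larger means stronger dependence.
    return np.log(v_joint * v_marg / v_cross**2)
```

For feature selection, one would score each candidate feature with `cs_qmi(feature, labels)` and keep the highest-scoring ones.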
| Main Authors: | Davor Sluga, Uroš Lotrič |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2017-04-01 |
| Series: | Entropy |
| Online Access: | http://www.mdpi.com/1099-4300/19/4/157 |
Similar Items
- Scalable information-optimal compressive target recognition
  by: Kerviche, Ronan, et al.
  Published: (2016)
- Feature selection based on fuzzy joint mutual information maximization
  by: Omar A. M. Salem, et al.
  Published: (2021-04-01)
- A Mutual Information estimator for continuous and discrete variables applied to Feature Selection and Classification problems
  by: Frederico Coelho, et al.
  Published: (2016-08-01)
- Input Feature Selection Method Based on Feature Set Equivalence and Mutual Information Gain Maximization
  by: Xinzheng Wang, et al.
  Published: (2019-01-01)
- Feature Selection with Conditional Mutual Information Considering Feature Interaction
  by: Jun Liang, et al.
  Published: (2019-07-01)