Simple Stopping Criteria for Information Theoretic Feature Selection
Feature selection aims to select the smallest feature subset that yields the minimum generalization error. Within the rich literature on feature selection, information theory-based approaches seek a subset of features such that the mutual information between the selected features and the class labels is...
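The abstract describes selecting features so that the mutual information between the chosen subset and the class labels is maximized. As an illustration only (not the paper's method, which concerns stopping criteria for such procedures), here is a minimal sketch of the common greedy variant: score each discrete feature by its empirical univariate mutual information with the labels and pick the top scorers one at a time. All function names below are our own; the sketch ignores feature redundancy.

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information I(x; y) in nats, for discrete 1-D arrays."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    for i, j in zip(x_idx, y_idx):          # accumulate joint counts
        joint[i, j] += 1
    joint /= joint.sum()                    # joint probability table
    px = joint.sum(axis=1, keepdims=True)   # marginal of x
    py = joint.sum(axis=0, keepdims=True)   # marginal of y
    nz = joint > 0                          # skip zero cells (0 log 0 = 0)
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def greedy_mi_selection(X, y, k):
    """Greedily pick k column indices of X maximizing I(feature; labels)."""
    remaining = list(range(X.shape[1]))
    selected = []
    for _ in range(k):
        best = max(remaining, key=lambda f: mutual_information(X[:, f], y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

A stopping criterion, as studied in the article, would decide when to halt this loop rather than fixing `k` in advance; the sketch above simply takes `k` as given.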
Main Authors: Shujian Yu, José C. Príncipe
Format: Article
Language: English
Published: MDPI AG, 2019-01-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/21/1/99
Similar Items
- Conditional Rényi Entropy and the Relationships between Rényi Capacities
  by: Gautam Aishwarya, et al.
  Published: (2020-05-01)
- On a General Definition of Conditional Rényi Entropies
  by: Velimir M. Ilić, et al.
  Published: (2017-11-01)
- Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information
  by: Changxiao Cai, et al.
  Published: (2019-10-01)
- Rényi Entropy and Rényi Divergence in Product MV-Algebras
  by: Dagmar Markechová, et al.
  Published: (2018-08-01)
- Conditional Rényi Divergences and Horse Betting
  by: Cédric Bleuler, et al.
  Published: (2020-03-01)