Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence

Cross entropy and Kullback–Leibler (KL) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to maximizing likelihood, and thus, cross entrop...
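The relationship stated in the abstract, that cross entropy decomposes into entropy plus KL divergence (so minimizing cross entropy against a fixed target minimizes the divergence), can be sketched with standard definitions; the distributions `p` and `q` below are illustrative, not from the paper:

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i * log(q_i)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Example distributions over three outcomes (illustrative values).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Decomposition: H(p, q) = H(p) + D(p || q).
lhs = cross_entropy(p, q)
rhs = entropy(p) + kl_divergence(p, q)
assert abs(lhs - rhs) < 1e-12
```

Because H(p) does not depend on q, the decomposition shows that minimizing H(p, q) over q is the same as minimizing D(p || q).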


Bibliographic Details
Main Authors: Mateu Sbert, Min Chen, Jordi Poch, Anton Bardera
Format: Article
Language: English
Published: MDPI AG 2018-12-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/20/12/959