On the Measurement of Randomness (Uncertainty): A More Informative Entropy


Bibliographic Details
Main Author: Tarald O. Kvålseth
Format: Article
Language: English
Published: MDPI AG, 2016-04-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/18/5/159
Description
Summary: As a measure of randomness or uncertainty, the Boltzmann–Shannon entropy H has become one of the most widely used summary measures of a variety of attributes (characteristics) in different disciplines. This paper points out an often overlooked limitation of H: comparisons between differences in H-values are not valid. An alternative entropy H_K is introduced as a preferred member of a new family of entropies for which difference comparisons are proved to be valid by satisfying a given value-validity condition. The entropy H_K is shown to have the appropriate properties for a randomness (uncertainty) measure, including a close linear relationship to a measurement criterion based on the Euclidean distance between probability distributions. This last point is demonstrated by means of computer-generated random distributions. The results are also compared with those of another member of the entropy family. A statistical inference procedure for the entropy H_K is formulated.
ISSN:1099-4300
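
The abstract contrasts the Boltzmann–Shannon entropy H with a Euclidean-distance criterion for departure from maximum randomness. The paper's own entropy H_K is not defined in this record, so the sketch below only illustrates the two standard quantities the summary names: H computed with natural logarithms, and the Euclidean distance from the uniform distribution. The example distributions are hypothetical.

```python
import math

def shannon_entropy(p):
    """Boltzmann–Shannon entropy H(p) = -sum_i p_i ln p_i (natural log).

    Terms with p_i = 0 contribute 0 by the usual convention.
    """
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def euclidean_distance(p, q):
    """Euclidean distance between two probability distributions p and q."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

# Hypothetical distributions over 4 categories; uniform is the maximum-entropy case.
uniform = [0.25, 0.25, 0.25, 0.25]
p = [0.7, 0.1, 0.1, 0.1]   # strongly skewed: low entropy, far from uniform
q = [0.4, 0.3, 0.2, 0.1]   # mildly skewed: higher entropy, closer to uniform

# H is maximal (ln 4) for the uniform distribution and decreases with skewness.
print(shannon_entropy(uniform))  # ln(4) ≈ 1.3863
print(shannon_entropy(p))
print(shannon_entropy(q))

# Distance from the uniform distribution as an alternative measure of
# departure from complete randomness: larger distance, less random.
print(euclidean_distance(p, uniform))
print(euclidean_distance(q, uniform))
```

Note that both measures rank p as less random than q here; the paper's point, as summarized above, concerns whether *differences* between such values can be compared validly, which plain H does not guarantee.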