Characterizing the Asymptotic Per-Symbol Redundancy of Memoryless Sources over Countable Alphabets in Terms of Single-Letter Marginals

The minimum expected number of bits needed to describe a random variable is its entropy, assuming knowledge of the distribution of the random variable. On the other hand, universal compression describes data supposing that the underlying distribution is unknown, but that it belongs to a known set P...
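As a quick illustration of the entropy bound mentioned in the abstract, a minimal Python sketch (the helper name `entropy` is ours, not from the article):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p (a list of probabilities)."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# A fair coin needs 1 bit per symbol on average; a uniform 4-ary source needs 2 bits.
print(entropy([0.5, 0.5]))        # → 1.0
print(entropy([0.25] * 4))        # → 2.0
```

Universal compression asks how many *extra* bits beyond this entropy (the redundancy) are unavoidable when the true distribution is known only to lie in some class P.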

Bibliographic Details
Main Authors: Maryam Hosseini, Narayana Santhanam
Format: Article
Language: English
Published: MDPI AG 2014-07-01
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/16/7/4168