Log Likelihood Spectral Distance, Entropy Rate Power, and Mutual Information with Applications to Speech Coding

We provide a new derivation of the log likelihood spectral distance measure for signal processing, based on the logarithm of the ratio of entropy rate powers. Using this interpretation, we show that the log likelihood ratio is equivalent to the difference of two differential entropies, and further that...
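The equivalence stated in the abstract can be checked numerically in a simple special case. The sketch below assumes stationary Gaussian processes, for which the differential entropy rate is determined by the one-step prediction (innovation) variance and Shannon's entropy rate power is Q = exp(2h)/(2πe); the variances `s1` and `s2` are illustrative values, not from the article.

```python
import math

def gaussian_entropy_rate(sigma2):
    # Differential entropy rate of a stationary Gaussian process
    # whose one-step prediction (innovation) variance is sigma2.
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

def entropy_rate_power(h):
    # Shannon's entropy rate power: Q = exp(2h) / (2*pi*e).
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Illustrative innovation variances for two processes
# (e.g., two speech frames); chosen arbitrarily here.
s1, s2 = 0.9, 0.25

h1 = gaussian_entropy_rate(s1)
h2 = gaussian_entropy_rate(s2)
Q1 = entropy_rate_power(h1)
Q2 = entropy_rate_power(h2)

# The log ratio of entropy rate powers equals twice the
# difference of the differential entropy rates.
log_ratio = math.log(Q1 / Q2)
diff_entropies = 2 * (h1 - h2)
print(log_ratio, diff_entropies)
```

For Gaussian processes the entropy rate power reduces to the innovation variance itself, so `Q1` and `Q2` recover `s1` and `s2`, and the two printed quantities agree.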


Bibliographic Details
Main Authors: Jerry D. Gibson, Preethi Mahadevan
Format: Article
Language: English
Published: MDPI AG, 2017-09-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/19/9/496