On Generalized Measures of Information with Maximum and Minimum Entropy Prescriptions

Kullback-Leibler relative-entropy, or KL-entropy, of P with respect to R, defined as ∫_X ln(dP/dR) dP, where P and R are probability measures on a measurable space (X, 𝔐), plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy – discrete case de...
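For discrete distributions the defining integral reduces to the familiar sum Σ_i p_i ln(p_i / r_i). A minimal Python sketch of that discrete case follows; it is illustrative only and not part of the record or the thesis, and the function name kl_divergence is an assumption made here for clarity.

    # Illustrative sketch (not from the thesis): discrete Kullback-Leibler
    # relative-entropy D(P || R) = sum_i p_i * ln(p_i / r_i), the discrete
    # counterpart of the integral definition quoted in the abstract.
    import math

    def kl_divergence(p, r):
        """Return D(P || R) in nats for two discrete distributions.

        Assumes p and r are sequences of probabilities over the same
        finite alphabet, with r_i > 0 wherever p_i > 0 (absolute
        continuity, P << R); otherwise the divergence is infinite.
        """
        total = 0.0
        for p_i, r_i in zip(p, r):
            if p_i == 0.0:
                continue  # the term 0 * ln(0 / r_i) is taken to be 0
            if r_i == 0.0:
                return math.inf  # P is not absolutely continuous w.r.t. R
            total += p_i * math.log(p_i / r_i)
        return total

    # Example: P = (0.5, 0.5), R = (0.9, 0.1)
    print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # ~0.5108 nats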

Bibliographic Details
Main Author: Dukkipati, Ambedkar
Other Authors: Narasimha Murty, M
Language: en_US
Published: 2008
Subjects:
Online Access: http://hdl.handle.net/2005/353