The Kullback–Leibler Information Function for Infinite Measures

In this paper, we introduce the Kullback–Leibler information function ρ(ν, μ) and prove the local large deviation principle for σ-finite measures μ and finitely additive probability measures ν. In particular, the entropy of a continuous probability distribution ν on the real axis is interpreted as the exponential rate of asymptotics for the Lebesgue measure of the set of those samples that generate empirical measures close to ν in a suitable fine topology.
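As a point of reference only: the formula below is the classical Kullback–Leibler divergence for probability measures ν and μ, not the paper's definition of ρ(ν, μ), which extends this object to σ-finite μ and finitely additive ν.

% Classical Kullback--Leibler divergence of a probability measure \nu
% with respect to a probability measure \mu (the special case that the
% article's \rho(\nu, \mu) generalizes).
\[
  D(\nu \,\|\, \mu)
    = \int \log\frac{d\nu}{d\mu}\, d\nu
    = \int \frac{d\nu}{d\mu}\,\log\frac{d\nu}{d\mu}\, d\mu ,
  \qquad
  D(\nu \,\|\, \mu) = +\infty \ \text{if } \nu \not\ll \mu .
\]

For a distribution ν with density f on the real axis and μ the Lebesgue measure, the negative of this quantity is the differential entropy −∫ f log f dx, the "entropy of a continuous probability distribution" given the large-deviation interpretation in the abstract; whether the article uses exactly this sign convention for ρ(ν, μ) should be checked against the full text.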

Bibliographic Details
Main Authors: Victor Bakhtin, Edvard Sokal
Format: Article
Language: English
Published: MDPI AG, 2016-12-01
Series: Entropy
Subjects: Kullback–Leibler information function; entropy; large deviation principle; empirical measure; fine topology; spectral potential
Online Access: http://www.mdpi.com/1099-4300/18/12/448
Citation: Entropy, Vol. 18, Iss. 12, Article 448 (2016)
ISSN: 1099-4300
DOI: 10.3390/e18120448
Author Affiliations:
Victor Bakhtin: Department of Mathematics, IT and Landscape Architecture, John Paul II Catholic University of Lublin, Konstantynuv Str. 1H, 20-708 Lublin, Poland
Edvard Sokal: Department of Mechanics and Mathematics, Belarusian State University, Nezavisimosti Ave. 4, 220030 Minsk, Belarus