A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications
We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure $d(x, \hat{x}) = |x - \hat{x}|^r$, with $r \ge 1$, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most $\log \sqrt{\pi e} \approx 1.5$ bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most $\log \sqrt{\pi e / 2} \approx 1$ bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most $\log \sqrt{\pi e / 2} \approx 1$ bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry.
Main Authors: | Arnaud Marsiglietti, Victoria Kostina |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2018-03-01 |
Series: | Entropy |
Subjects: | differential entropy; reverse entropy power inequality; rate-distortion function; Shannon lower bound; channel capacity; log-concave distribution; hyperplane conjecture |
Online Access: | http://www.mdpi.com/1099-4300/20/3/185 |
id |
doaj-13ce1c0b9c4643cc83261a5fe735f29d |
record_format |
Article |
spelling |
Record ID: doaj-13ce1c0b9c4643cc83261a5fe735f29d (last updated 2020-11-25T00:22:43Z). Language: eng. Publisher: MDPI AG. Journal: Entropy, ISSN 1099-4300. Published: 2018-03-01, Volume 20, Issue 3, Article 185. DOI: 10.3390/e20030185 (article ID e20030185). Title: A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications. Authors: Arnaud Marsiglietti (Center for the Mathematics of Information, California Institute of Technology, Pasadena, CA 91125, USA); Victoria Kostina (Department of Electrical Engineering, California Institute of Technology, Pasadena, CA 91125, USA). Abstract: as given in the description field below. Online access: http://www.mdpi.com/1099-4300/20/3/185. Keywords: differential entropy; reverse entropy power inequality; rate-distortion function; Shannon lower bound; channel capacity; log-concave distribution; hyperplane conjecture |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Arnaud Marsiglietti; Victoria Kostina |
title |
A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications |
publisher |
MDPI AG |
series |
Entropy |
issn |
1099-4300 |
publishDate |
2018-03-01 |
description |
We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure $d(x, \hat{x}) = |x - \hat{x}|^r$, with $r \ge 1$, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most $\log \sqrt{\pi e} \approx 1.5$ bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most $\log \sqrt{\pi e / 2} \approx 1$ bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most $\log \sqrt{\pi e / 2} \approx 1$ bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry. |
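As a quick numerical sanity check of the constants quoted in the abstract, the following Python sketch (our own illustration, not from the paper) evaluates $\log \sqrt{\pi e}$ and $\log \sqrt{\pi e / 2}$ in bits, and compares the differential entropy of a Laplace source, a standard log-concave example of our choosing, with that of a Gaussian of equal variance.

```python
import math

# Constants from the abstract, in bits (base-2 logarithms).
gap_general = 0.5 * math.log2(math.pi * math.e)      # log sqrt(pi e)   ~ 1.547
gap_mse = 0.5 * math.log2(math.pi * math.e / 2.0)    # log sqrt(pi e/2) ~ 1.047
print(f"log sqrt(pi e)   = {gap_general:.3f} bits")  # matches the quoted ~1.5
print(f"log sqrt(pi e/2) = {gap_mse:.3f} bits")      # matches the quoted ~1

# Illustration: Laplace(scale b) is log-concave with variance 2*b^2 and
# differential entropy log2(2*b*e) bits; the Gaussian of the same variance
# (the entropy maximizer) has 0.5*log2(2*pi*e*sigma^2) bits.
b = 1.0
h_laplace = math.log2(2.0 * b * math.e)
h_gaussian = 0.5 * math.log2(2.0 * math.pi * math.e * 2.0 * b * b)
print(f"h(Gaussian) - h(Laplace) = {h_gaussian - h_laplace:.3f} bits")  # ~0.105
```

The two printed constants agree with the approximations 1.5 and 1 quoted above, and the Laplace example gives a concrete log-concave density whose entropy sits well within $\log \sqrt{\pi e / 2}$ bits of the Gaussian maximum.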
topic |
differential entropy; reverse entropy power inequality; rate-distortion function; Shannon lower bound; channel capacity; log-concave distribution; hyperplane conjecture |
url |
http://www.mdpi.com/1099-4300/20/3/185 |
_version_ |
1725358675801931776 |