Some Dissimilarity Measures of Branching Processes and Optimal Decision Making in the Presence of Potential Pandemics
We compute exact values (respectively, bounds) of dissimilarity/distinguishability measures, in the sense of the Kullback-Leibler information distance (relative entropy) and some transforms of more general power divergences and Rényi divergences, between two competing discrete-time <i>Galton-Watson...
Main Authors: | Niels B. Kammerer, Wolfgang Stummer
---|---
Format: | Article
Language: | English
Published: | MDPI AG, 2020-08-01
Series: | Entropy
Subjects: |
Online Access: | https://www.mdpi.com/1099-4300/22/8/874
Similar Items
- On Renyi Divergence Measures for Continuous Alphabet Sources
  by: GIL, MANUEL
  Published: (2011)
- Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding
  by: Wentao Huang, et al.
  Published: (2019-03-01)
- On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds
  by: Frank Nielsen
  Published: (2020-06-01)
- Canonical Divergence for Flat <i>α</i>-Connections: Classical and Quantum
  by: Domenico Felice, et al.
  Published: (2019-08-01)
- Kullback–Leibler Divergence Measure for Multivariate Skew-Normal Distributions
  by: Reinaldo B. Arellano-Valle, et al.
  Published: (2012-09-01)