Objective Bayesian Analysis of Kullback-Liebler Divergence of two Multivariate Normal Distributions with Common Covariance Matrix and Star-shape Gaussian Graphical Model
This dissertation consists of four independent but related parts, each presented in a chapter. The first part is introductory; it provides background and preparation for the later parts. The second part discusses two multivariate normal populations with a common covariance matrix. The goal of this part is to derive objective/non-informative priors for the parameterizations and to use these priors to build constructive random posteriors of the Kullback-Leibler (KL) divergence of the two multivariate normal populations, which is proportional to the distance between the two means, weighted by the common precision matrix. We use the Cholesky decomposition to re-parameterize the precision matrix. The KL divergence is a true distance measure between two multivariate normal populations with a common covariance matrix. Frequentist properties of the Bayesian procedure using these objective priors are studied through analytical and numerical tools. The third part considers the star-shape Gaussian graphical model, a special case of the undirected Gaussian graphical model: a multivariate normal distribution whose variables are grouped into one "global" variable set and several "local" variable sets. When conditioned on the global variable set, the local variable sets are independent of each other. We again adopt the Cholesky decomposition to re-parameterize the precision matrix and derive Jeffreys' prior, reference priors, and invariant priors for the new parameterizations. The frequentist properties of the Bayesian procedure using these objective priors are also studied. The last part concentrates on objective Bayesian analysis of the partial correlation coefficient and its application to multivariate Gaussian models.
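For two multivariate normal populations sharing a covariance matrix, as in the title, the KL divergence reduces to half the squared Mahalanobis distance between the two means, and is therefore symmetric. A minimal NumPy sketch with illustrative (not dissertation) values:

```python
import numpy as np

def kl_mvn_common_cov(mu1, mu2, sigma):
    """KL divergence between N(mu1, Sigma) and N(mu2, Sigma).

    With a common covariance matrix, the trace and log-determinant
    terms of the general MVN KL formula cancel, leaving half the
    squared Mahalanobis distance between the means.
    """
    diff = mu1 - mu2
    return 0.5 * diff @ np.linalg.solve(sigma, diff)

# illustrative values, not from the dissertation
mu1 = np.array([0.0, 0.0])
mu2 = np.array([1.0, 2.0])
sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

kl12 = kl_mvn_common_cov(mu1, mu2, sigma)
kl21 = kl_mvn_common_cov(mu2, mu1, sigma)
# kl12 == kl21: with a shared covariance the divergence is symmetric,
# which is why the abstract can call it a true distance measure.
```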
Main Author: | Li, Zhonggai |
---|---|
Other Authors: | |
Format: | Others |
Published: | Virginia Tech 2014 |
Subjects: | Multivariate Normal Distributions; Monte Carlo; Star-shape Gaussian Graphical Model; Objective Priors; Jeffreys' Priors; Reference Priors; Invariant Haar Prior; Fisher Information Matrix; Frequentist Matching; Kullback-Liebler Divergence |
Online Access: | http://hdl.handle.net/10919/28121 http://scholar.lib.vt.edu/theses/available/etd-06252008-155353/ |
id |
ndltd-VTETD-oai-vtechworks.lib.vt.edu-10919-28121 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-VTETD-oai-vtechworks.lib.vt.edu-10919-28121 2020-09-26T05:30:32Z Objective Bayesian Analysis of Kullback-Liebler Divergence of two Multivariate Normal Distributions with Common Covariance Matrix and Star-shape Gaussian Graphical Model Li, Zhonggai Statistics Du, Pang Morgan, John P. Smith, Eric P. Sun, Dongchu Multivariate Normal Distributions Monte Carlo Star-shape Gaussian Graphical Model Objective Priors Jeffreys' Priors Reference Priors Invariant Haar Prior Fisher Information Matrix Frequentist Matching Kullback-Liebler Divergence This dissertation consists of four independent but related parts, each presented in a chapter. The first part is introductory; it provides background and preparation for the later parts. The second part discusses two multivariate normal populations with a common covariance matrix. The goal of this part is to derive objective/non-informative priors for the parameterizations and to use these priors to build constructive random posteriors of the Kullback-Leibler (KL) divergence of the two multivariate normal populations, which is proportional to the distance between the two means, weighted by the common precision matrix. We use the Cholesky decomposition to re-parameterize the precision matrix. The KL divergence is a true distance measure between two multivariate normal populations with a common covariance matrix. Frequentist properties of the Bayesian procedure using these objective priors are studied through analytical and numerical tools. The third part considers the star-shape Gaussian graphical model, a special case of the undirected Gaussian graphical model: a multivariate normal distribution whose variables are grouped into one "global" variable set and several "local" variable sets. When conditioned on the global variable set, the local variable sets are independent of each other. We again adopt the Cholesky decomposition to re-parameterize the precision matrix and derive Jeffreys' prior, reference priors, and invariant priors for the new parameterizations. The frequentist properties of the Bayesian procedure using these objective priors are also studied. The last part concentrates on objective Bayesian analysis of the partial correlation coefficient and its application to multivariate Gaussian models. Ph. D. 2014-03-14T20:13:29Z 2014-03-14T20:13:29Z 2008-06-18 2008-06-25 2008-07-22 2008-07-22 Dissertation etd-06252008-155353 http://hdl.handle.net/10919/28121 http://scholar.lib.vt.edu/theses/available/etd-06252008-155353/ ZhonggaiLI_ETD.pdf In Copyright http://rightsstatements.org/vocab/InC/1.0/ application/pdf Virginia Tech |
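The star-shape model's defining property, that the local variable sets are conditionally independent given the global set, corresponds to zeros in the precision matrix between the local blocks. A hypothetical simulation (one global variable `z`, two scalar local variables, illustrative coefficients) that checks this numerically:

```python
import numpy as np

# Hypothetical star-shape structure: x1 and x2 each depend on the
# global variable z, but not on each other.  Conditional independence
# of x1 and x2 given z appears as a (near-)zero entry in the
# estimated precision matrix.
rng = np.random.default_rng(0)
n = 200_000
z = rng.standard_normal(n)
x1 = 0.8 * z + rng.standard_normal(n)   # local variable 1
x2 = -0.5 * z + rng.standard_normal(n)  # local variable 2

sigma_hat = np.cov(np.vstack([z, x1, x2]))   # 3x3 sample covariance
omega_hat = np.linalg.inv(sigma_hat)         # sample precision matrix
# omega_hat[1, 2] (the x1-x2 entry) is near zero,
# while the z-x1 entry omega_hat[0, 1] is clearly nonzero.
```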
collection |
NDLTD |
format |
Others
|
sources |
NDLTD |
topic |
Multivariate Normal Distributions; Monte Carlo; Star-shape Gaussian Graphical Model; Objective Priors; Jeffreys' Priors; Reference Priors; Invariant Haar Prior; Fisher Information Matrix; Frequentist Matching; Kullback-Liebler Divergence |
spellingShingle |
Multivariate Normal Distributions; Monte Carlo; Star-shape Gaussian Graphical Model; Objective Priors; Jeffreys' Priors; Reference Priors; Invariant Haar Prior; Fisher Information Matrix; Frequentist Matching; Kullback-Liebler Divergence; Li, Zhonggai; Objective Bayesian Analysis of Kullback-Liebler Divergence of two Multivariate Normal Distributions with Common Covariance Matrix and Star-shape Gaussian Graphical Model |
description |
This dissertation consists of four independent but related parts, each presented in a chapter. The first part is introductory; it provides background and preparation for the later parts. The second part discusses two multivariate normal populations with a common covariance matrix. The goal of this part is to derive objective/non-informative priors for the parameterizations and to use these priors to build constructive random posteriors of the Kullback-Leibler (KL) divergence of the two multivariate normal populations, which is proportional to the distance between the two means, weighted by the common precision matrix. We use the Cholesky decomposition to re-parameterize the precision matrix. The KL divergence is a true distance measure between two multivariate normal populations with a common covariance matrix. Frequentist properties of the Bayesian procedure using these objective priors are studied through analytical and numerical tools. The third part considers the star-shape Gaussian graphical model, a special case of the undirected Gaussian graphical model: a multivariate normal distribution whose variables are grouped into one "global" variable set and several "local" variable sets. When conditioned on the global variable set, the local variable sets are independent of each other. We again adopt the Cholesky decomposition to re-parameterize the precision matrix and derive Jeffreys' prior, reference priors, and invariant priors for the new parameterizations. The frequentist properties of the Bayesian procedure using these objective priors are also studied. The last part concentrates on objective Bayesian analysis of the partial correlation coefficient and its application to multivariate Gaussian models. === Ph. D. |
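The Cholesky re-parameterization of the precision matrix that the abstract refers to can be sketched numerically: the precision matrix is factored as a lower-triangular matrix times its transpose, and the free entries of that factor become the new parameters. A minimal illustration with made-up values (the dissertation's actual parameterization details are not reproduced here):

```python
import numpy as np

# Illustrative covariance matrix (not from the dissertation)
sigma = np.array([[2.0, 0.5, 0.2],
                  [0.5, 1.0, 0.3],
                  [0.2, 0.3, 1.5]])

omega = np.linalg.inv(sigma)      # precision matrix
L = np.linalg.cholesky(omega)     # lower-triangular Cholesky factor

# The factor's free entries re-parameterize omega: L @ L.T
# reconstructs the precision matrix exactly, and L is
# lower-triangular, so the parameterization is unconstrained
# apart from positive diagonal entries.
recon = L @ L.T
```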
author2 |
Statistics |
author_facet |
Statistics Li, Zhonggai |
author |
Li, Zhonggai |
author_sort |
Li, Zhonggai |
title |
Objective Bayesian Analysis of Kullback-Liebler Divergence of two Multivariate Normal Distributions with Common Covariance Matrix and Star-shape Gaussian Graphical Model |
title_short |
Objective Bayesian Analysis of Kullback-Liebler Divergence of two Multivariate Normal Distributions with Common Covariance Matrix and Star-shape Gaussian Graphical Model |
title_full |
Objective Bayesian Analysis of Kullback-Liebler Divergence of two Multivariate Normal Distributions with Common Covariance Matrix and Star-shape Gaussian Graphical Model |
title_fullStr |
Objective Bayesian Analysis of Kullback-Liebler Divergence of two Multivariate Normal Distributions with Common Covariance Matrix and Star-shape Gaussian Graphical Model |
title_full_unstemmed |
Objective Bayesian Analysis of Kullback-Liebler Divergence of two Multivariate Normal Distributions with Common Covariance Matrix and Star-shape Gaussian Graphical Model |
title_sort |
objective bayesian analysis of kullback-liebler divergence of two multivariate normal distributions with common covariance matrix and star-shape gaussian graphical model |
publisher |
Virginia Tech |
publishDate |
2014 |
url |
http://hdl.handle.net/10919/28121 http://scholar.lib.vt.edu/theses/available/etd-06252008-155353/ |
work_keys_str_mv |
AT lizhonggai objectivebayesiananalysisofkullbacklieblerdivergenceoftwomultivariatenormaldistributionswithcommoncovariancematrixandstarshapegaussiangraphicalmodel |
_version_ |
1719340602435305472 |