Approximated Information Analysis in Bayesian Inference

In models with nuisance parameters, Bayesian procedures based on Markov chain Monte Carlo (MCMC) methods have been developed to approximate the posterior distribution of the parameter of interest. Because these procedures involve computationally burdensome MCMC runs, their approximation quality and convergence are important issues. In this paper, we explore Gibbs sensitivity by using an alternative to the full conditional distribution of the nuisance parameter. The approximate sensitivity of the posterior distribution of interest is studied in terms of an information measure, including Kullback–Leibler divergence. As an illustration, we then apply these results to simple spatial model settings.
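The approach described in the abstract, replacing the exact full conditional of the nuisance parameter inside a Gibbs sampler with an approximation and then quantifying the effect on the posterior of interest with Kullback–Leibler divergence, can be illustrated with a toy example. The following Python sketch is not the authors' implementation: the normal model, the priors, the Laplace-style normal approximation to the Gamma full conditional, and the histogram-based KL estimate are all illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): Gibbs sampling for
# y_i ~ N(theta, 1/lam) with nuisance precision lam, comparing the exact
# Gamma full conditional for lam against a Laplace-style normal
# approximation, and measuring the effect on the posterior of theta with
# a crude histogram-based Kullback-Leibler estimate.
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(0)

# Simulated data and prior settings (all values illustrative).
n, theta_true, lam_true = 50, 2.0, 1.5
y = rng.normal(theta_true, 1.0 / np.sqrt(lam_true), size=n)
a0, b0 = 2.0, 1.0        # Gamma(a0, b0) prior on lam
tau0 = 1.0 / 100.0       # prior precision of theta ~ N(0, 100)

def gibbs(num_iter=20000, approx_lam=False):
    theta, lam = 0.0, 1.0
    draws = np.empty(num_iter)
    for t in range(num_iter):
        # Exact (conjugate) full conditional of theta: normal.
        prec = lam * n + tau0
        theta = rng.normal(lam * y.sum() / prec, 1.0 / np.sqrt(prec))
        # Full conditional of lam: Gamma(a, b).
        a = a0 + n / 2.0
        b = b0 + 0.5 * np.sum((y - theta) ** 2)
        if approx_lam:
            # Laplace-style normal approximation centered at the Gamma
            # mode, with curvature-matched variance; resample until positive.
            mode, sd = (a - 1.0) / b, np.sqrt(a - 1.0) / b
            lam = -1.0
            while lam <= 0.0:
                lam = rng.normal(mode, sd)
        else:
            lam = rng.gamma(a, 1.0 / b)
        draws[t] = theta
    return draws[num_iter // 2:]   # discard the first half as burn-in

exact, approx = gibbs(approx_lam=False), gibbs(approx_lam=True)

# Crude KL estimate between the two theta posteriors on shared histogram bins.
bins = np.histogram_bin_edges(np.concatenate([exact, approx]), bins=60)
p, _ = np.histogram(exact, bins=bins, density=True)
q, _ = np.histogram(approx, bins=bins, density=True)
mask = (p > 0) & (q > 0)
print("KL(exact || approx) ~", entropy(p[mask], q[mask]))
```

In this conjugate toy model the exact conditional is available, so a small KL value indicates that the approximated conditional leaves the marginal posterior of theta essentially unchanged; this mirrors the kind of sensitivity comparison the abstract describes.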

Bibliographic Details
Main Authors: Jung In Seo (Department of Statistics, Yeungnam University, Gyeongsan 712-749, Korea); Yongku Kim (Department of Statistics, Kyungpook National University, Daegu 702-701, Korea)
Format: Article
Language: English
Published: MDPI AG, 2015-03-01
Series: Entropy, Vol. 17, No. 3, pp. 1441–1451
ISSN: 1099-4300
DOI: 10.3390/e17031441
Subjects: Bayesian sensitivity; Gibbs sampler; Kullback–Leibler divergence; Laplace approximation
Online Access: http://www.mdpi.com/1099-4300/17/3/1441