Joint mean-covariance modelling and variable selection for longitudinal data analysis

In brief, this thesis develops both parametric and nonparametric modelling and variable selection tools for longitudinal data analysis. The first part of my work extends generalized estimating equations (GEEs) with random effects to the joint modelling of longitudinal data. This is a parametric approach...


Bibliographic Details
Main Author: Huang, Chao
Other Authors: Pan, Jianxin
Published: University of Manchester 2011
Subjects: 519.5
Online Access:http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.532189
id ndltd-bl.uk-oai-ethos.bl.uk-532189
record_format oai_dc
spelling ndltd-bl.uk-oai-ethos.bl.uk-532189 2017-07-25T03:22:44Z
519.5
University of Manchester
http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.532189
https://www.research.manchester.ac.uk/portal/en/theses/joint-meancovariance-modelling-and-variable-selection-for-longitudinal-data-analysis(b149a7f7-ec34-4709-99f0-e96f4057533a).html
Electronic Thesis or Dissertation
collection NDLTD
sources NDLTD
topic 519.5
spellingShingle 519.5
Huang, Chao
Joint mean-covariance modelling and variable selection for longitudinal data analysis
description In brief, this thesis develops both parametric and nonparametric modelling and variable selection tools for longitudinal data analysis. The first part of my work extends generalized estimating equations (GEEs) with random effects to the joint modelling of longitudinal data. This is a parametric approach in which heterogeneity and heteroscedasticity across individuals are taken into account. Assuming only that the first four moments of the responses exist, the random effects act as a kind of penalty in the extended GEEs, so the approach combines the virtues of GEEs with those of joint modelling with random effects. The modified Cholesky decomposition is used for the joint modelling because it has an explicit statistical interpretation. This work applies to longitudinal data analyses in which individual performance is of primary interest. The second part of the thesis is dedicated to the selection of random effects in the generalized linear mixed model (GLMM). Penalty functions are applied to the random-effects covariance components, and penalized quasi-likelihood (PQL) is used to handle the integration in the likelihood: when a nonzero covariance component is selected, the corresponding random effect is retained, while random effects whose covariance components shrink to zero are eliminated. A backfitting algorithm is proposed for parameter estimation, and the leave-one-subject-out cross-validation (SCV) method is used to select the optimal value of the tuning parameter in the penalty function. The value of this work is that random effects, too, can be selected parsimoniously with penalty functions. Moreover, extending the approach to the joint selection of fixed and random effects is straightforward, making it applicable in a more general setting.
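The leave-one-subject-out cross-validation (SCV) used for tuning-parameter selection can be sketched generically. The following is a minimal illustration, not the thesis's algorithm: `ridge_fit` is a hypothetical stand-in for the penalized estimators, refitted with each subject held out in turn, and the candidate tuning parameter with the smallest out-of-subject prediction error is chosen.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Stand-in penalized fit: ridge regression with tuning parameter lam."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def scv_select(lams, subjects, X, y):
    """Leave-one-subject-out CV: for each candidate tuning parameter, hold
    out one subject at a time, fit on the remaining subjects, and sum the
    squared prediction errors over the held-out subject's observations."""
    scores = []
    for lam in lams:
        err = 0.0
        for s in np.unique(subjects):
            hold = subjects == s
            beta = ridge_fit(X[~hold], y[~hold], lam)
            err += np.sum((y[hold] - X[hold] @ beta) ** 2)
        scores.append(err)
    return lams[int(np.argmin(scores))]
```

Holding out entire subjects, rather than individual observations, respects the within-subject correlation structure of longitudinal data, which ordinary leave-one-out CV would ignore.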
The last part of the thesis uses a nonparametric, data-driven approach, namely local polynomial techniques, to analyse longitudinal data. Again based on the modified Cholesky decomposition, the within-subject covariance matrix is decomposed into a unit lower triangular matrix involving generalized autoregressive coefficients and a diagonal matrix involving innovation variances. Local polynomial smoothing is proposed to estimate the nonparametric smooth functions of the mean, the generalized autoregressive parameters and the log-innovation variances simultaneously. The leave-one-subject-out CV (SCV) method is again used for bandwidth selection. The novelty of this work lies in jointly modelling the mean and covariance parameters by local polynomial methods via the modified Cholesky decomposition; moreover, the proposed approach is computationally robust in applications.
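The modified Cholesky decomposition that underpins the joint mean-covariance models can be obtained directly from a standard Cholesky factorization. The sketch below (the function name `modified_cholesky` is mine, not from the thesis) follows the usual convention T Σ Tᵀ = D: T is unit lower triangular, with the negated generalized autoregressive coefficients below its diagonal, and the diagonal of D holds the innovation variances.

```python
import numpy as np

def modified_cholesky(sigma):
    """Decompose sigma so that T @ sigma @ T.T = D, with T unit lower
    triangular (negated generalized autoregressive coefficients below the
    diagonal) and D diagonal (innovation variances)."""
    L = np.linalg.cholesky(sigma)                # sigma = L @ L.T
    d = np.diag(L) ** 2                          # innovation variances
    T = np.diag(np.diag(L)) @ np.linalg.inv(L)   # rescale rows to unit diagonal
    return T, d
```

For an AR(1) covariance matrix, for instance, the lag-one entries of −T recover the autoregressive coefficient; this regression-on-predecessors reading is the kind of explicit statistical interpretation the abstract mentions.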
author2 Pan, Jianxin
author_facet Pan, Jianxin
Huang, Chao
author Huang, Chao
author_sort Huang, Chao
title Joint mean-covariance modelling and variable selection for longitudinal data analysis
title_short Joint mean-covariance modelling and variable selection for longitudinal data analysis
title_full Joint mean-covariance modelling and variable selection for longitudinal data analysis
title_fullStr Joint mean-covariance modelling and variable selection for longitudinal data analysis
title_full_unstemmed Joint mean-covariance modelling and variable selection for longitudinal data analysis
title_sort joint mean-covariance modelling and variable selection for longitudinal data analysis
publisher University of Manchester
publishDate 2011
url http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.532189
work_keys_str_mv AT huangchao jointmeancovariancemodellingandvariableselectionforlongitudinaldataanalysis
_version_ 1718503774044028928