Subset Selection of ARMA Models via a More Robust Choice of LASSO


Bibliographic Details
Main Author: Hsieh, Shang-Ju (謝尚儒)
Other Authors: Huang, Yu-Min
Format: Others
Language: zh-TW
Published: 2015
Online Access:http://ndltd.ncl.edu.tw/handle/264jg3
Description
Summary: Master's Thesis === Tunghai University === Department of Statistics === Academic Year 104 === In statistical model-building, maintaining prediction accuracy and selecting significant variables are essential tasks. The Least Absolute Shrinkage and Selection Operator (Lasso) can improve model prediction and identify significant variables, and the adaptive Lasso further yields estimates that enjoy the oracle properties. In time series analysis, an ARMA model can now be fitted via the adaptive Lasso, where either OLS estimates or fitted coefficients from ARMA models are commonly taken as the adaptive weights. However, directly using OLS estimates as weights may lead to highly variable estimates for some stationary ARMA models, particularly when the models are highly stationary and/or have higher lagged orders. In this work, we propose a more robust adaptive Lasso for ARMA models. By better adapting to the models' inherent features, our approach provides more robust solutions across various degrees of stationarity, with Lasso estimates that tend to be more tightly confined around the true models. We illustrate our approach with a number of simulation examples and a real application to closing prices of the Taiwan Top50 Exchange Tracker Fund (ETF50).
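
As a rough illustration of the weighting idea described in the abstract, the sketch below fits a sparse autoregressive approximation by adaptive Lasso in Python, taking OLS coefficients as the adaptive weights. This is a minimal sketch under stated assumptions, not the thesis's procedure: only the AR part is handled (the thesis's treatment of the MA component and its more robust weighting scheme is not reproduced), and the lag order p, penalty alpha, and weight exponent gamma are illustrative choices; numpy and scikit-learn are assumed available.

    import numpy as np
    from sklearn.linear_model import Lasso

    def adaptive_lasso_ar(y, p=5, alpha=0.05, gamma=1.0):
        # Lagged design matrix: the row for time t holds y[t-1], ..., y[t-p].
        y = np.asarray(y, dtype=float)
        n = len(y)
        X = np.column_stack([y[p - k - 1 : n - k - 1] for k in range(p)])
        target = y[p:]

        # Step 1: OLS fit; its coefficients define the adaptive weights.
        beta_ols, *_ = np.linalg.lstsq(X, target, rcond=None)
        w = 1.0 / (np.abs(beta_ols) ** gamma + 1e-8)

        # Step 2: adaptive Lasso via column rescaling; a uniform penalty on
        # the rescaled coefficients equals coefficient-specific penalties.
        fit = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000).fit(X / w, target)
        return fit.coef_ / w

    # Toy usage: recover a sparse AR(2) structure from a simulated series.
    rng = np.random.default_rng(0)
    y = np.zeros(500)
    eps = rng.standard_normal(500)
    for t in range(2, 500):
        y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + eps[t]
    print(adaptive_lasso_ar(y, p=5))

Rescaling the design columns by the weights before applying a standard Lasso is an equivalent way to impose coefficient-specific penalties, which is why the recovered coefficients are divided by the weights at the end.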