From AdaBoost to AHCBoost


Bibliographic Details
Main Authors: Wu-Chiang Chen, 陳武強
Other Authors: Chen-Hai Andy Tsao
Format: Others
Language: en_US
Published: 2009
Online Access: http://ndltd.ncl.edu.tw/handle/90269282006462480524
Description
Summary: Ph.D. === 國立東華大學 === 應用數學系 === 97 === Among the ensemble classifiers that have emerged in the past ten years, Boosting is the one that has attracted considerable attention from both the statistics and machine learning communities. Friedman, Hastie and Tibshirani (2000) view AdaBoost as Newton-like updates minimizing an exponential criterion. However, the convergence of the iterative procedure is not addressed. In this thesis, we cast the learning problem as a Bayesian minimization problem under a normal-normal setting. It is shown that the Bayes procedure can be obtained via an iterative Newton update minimizing the exponential criterion. With the step sizes of AdaBoost, this update is highly effective and leads to one-step convergence. Moreover, to apply Boosting to multi-level classification problems, we introduce an adjustable hyperbolic cosine loss to replace the exponential loss and develop a new Boosting algorithm, AHCBoost. The corresponding Bayes procedure and Bayes risk of the population version of AHCBoost can be obtained. Experiments on several benchmark datasets also show that the sample version of AHCBoost works well for multi-level classification problems.
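
For context, the exponential criterion referenced above in the Friedman, Hastie and Tibshirani (2000) view of AdaBoost, together with its well-known population minimizer, can be written as below. This is standard background only; the adjustable hyperbolic cosine loss that AHCBoost substitutes for it is defined in the thesis itself and is not reproduced here.

% Exponential criterion minimized (in population) by AdaBoost, with labels y in {-1, +1}
\[
  J(F) \;=\; \mathbb{E}\bigl[\exp\{-y F(x)\}\bigr],
\]
% whose population minimizer is half the log-odds of the positive class:
\[
  F^{*}(x) \;=\; \arg\min_{F} J(F)
          \;=\; \tfrac{1}{2}\,\log\frac{P(y=+1 \mid x)}{P(y=-1 \mid x)}.
\]

AHCBoost replaces the exponential loss exp(-yF) above with the thesis's adjustable hyperbolic cosine loss, for which the corresponding population Bayes procedure and Bayes risk are derived.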