Summary: | An improved version of the Boosted Decision Tree algorithm, named the Boosted Adaptive Apriori post-Pruned Decision Tree (Boosted AApoP-DT), was developed by drawing on Adaptive Apriori (AA) properties and applying a post-pruning technique. The post-pruning technique used is error-complexity pruning, which applies to decision trees in the Classification and Regression Trees (CART) category. It estimates the re-substitution, cross-validation, and generalization error rates before and after post-pruning. The novelty of the applied post-pruning technique is that it is augmented by AA properties, which depend on the data characteristics of the dataset(s) being accessed. The resulting algorithm is then boosted with the AdaBoost ensemble method. When the developed algorithm is compared with its counterpart without AA augmentation, i.e., the Boosted post-Pruned Decision Tree (Boosted poP-DT), and with the classical boosted decision tree algorithm, i.e., Boosted DT, a stepwise improvement is observed as the comparison proceeds from Boosted DT to Boosted poP-DT to Boosted AApoP-DT.
|