Sparsity-driven weighted ensemble classifier

In this study, a novel sparsity-driven weighted ensemble classifier (SDWEC) is proposed that improves classification accuracy while minimizing the number of classifiers. Using pre-trained classifiers, an ensemble is formed in which the base classifiers vote according to assigned weights; these weights directly affect the ensemble's accuracy. In the proposed method, the problem of finding the ensemble weights is modeled as a cost function with the following terms: (a) a data fidelity term aiming to decrease the misclassification rate, (b) a sparsity term aiming to decrease the number of classifiers, and (c) a non-negativity constraint on the classifier weights. Since the proposed cost function is non-convex and thus hard to solve, convex relaxation techniques and novel approximations are employed to obtain a numerically efficient solution. The sparsity term of the cost function allows a trade-off between accuracy and testing time when needed. The efficiency of SDWEC was tested on 11 datasets and compared with state-of-the-art classifier ensemble methods. The results show that SDWEC provides better or similar accuracy using fewer classifiers and reduces the ensemble's testing time.
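
The abstract names the three ingredients of the weight-finding cost function but not its form. As a reading aid only, the display below is a minimal sketch of one conventional way such a problem is written, assuming a generic data fidelity term f, an l1 sparsity penalty, and a non-negativity constraint; the symbols H (matrix of base-classifier decisions on the training data), y (true labels), w (ensemble weights), and lambda (sparsity trade-off parameter) are illustrative and are not taken from the paper, whose exact formulation and approximations differ.

\[
\min_{w \ge 0} \; f(Hw,\, y) \;+\; \lambda\, \lVert w \rVert_{1}
\]

Here f(Hw, y) plays the role of the data fidelity term (penalizing disagreement between the weighted votes Hw and the labels y), the l1 norm drives individual weights to exactly zero so that the corresponding base classifiers can be dropped from the ensemble, and lambda governs the accuracy versus testing-time trade-off mentioned in the abstract.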

Bibliographic Details
Main Authors: Atilla Özgandür, Fatih Nar, Hamit Erdem
Format: Article
Language: English
Published: Atlantis Press, 2018-01-01
Series: International Journal of Computational Intelligence Systems (ISSN 1875-6883)
DOI: 10.2991/ijcis.11.1.73
Subjects: Machine Learning; Ensemble; Convex Relaxation; Classification; Classifier Ensembles
Online Access: https://www.atlantis-press.com/article/25894608/view