Burg Matrix Divergence-Based Hierarchical Distance Metric Learning for Binary Classification

Bibliographic Details
Main Authors: Yan Wang, Han-Xiong Li
Format: Article
Language: English
Published: IEEE 2017-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/7858659/
Description
Summary: Distance metric learning underpins many learning algorithms and has been widely used in real-world applications. The basic idea of most distance metric learning methods is to find a space that optimally separates data points belonging to different categories. However, current methods mostly learn a single Mahalanobis distance in one space for each data set, which often fails to fully separate the categories in real-world applications. To improve the accuracy of binary classification, this paper proposes a hierarchical method that separates the categories completely by sequentially learning subspace distance metrics. First, a base-space distance metric is learned from a similarity constraint. Then, for binary classification problems, the subspace learning problem is formulated as a Burg matrix divergence minimization problem subject to distance constraints, and a cyclic projection algorithm is presented to solve it. Experiments on five UCI data sets, evaluated with several performance indices, demonstrate improved performance over state-of-the-art methods.
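The Burg matrix divergence named in the summary is the LogDet divergence between positive-definite matrices, D_B(A, B) = tr(A B^{-1}) - log det(A B^{-1}) - n, and the Mahalanobis distance is the quadratic form it constrains. A minimal NumPy sketch of these two quantities (function names are illustrative, not taken from the paper, and this is not the paper's full optimization procedure):

```python
import numpy as np

def burg_matrix_divergence(A, B):
    """Burg (LogDet) matrix divergence between SPD matrices A and B:
    D_B(A, B) = tr(A B^{-1}) - log det(A B^{-1}) - n.
    Zero iff A == B; note it is not symmetric in its arguments."""
    n = A.shape[0]
    M = A @ np.linalg.inv(B)
    # slogdet is numerically safer than log(det(M)) for ill-conditioned M
    sign, logdet = np.linalg.slogdet(M)
    return np.trace(M) - logdet - n

def mahalanobis_sq(x, y, W):
    """Squared Mahalanobis distance d_W(x, y) = (x - y)^T W (x - y)
    for a learned positive-semidefinite metric matrix W."""
    d = x - y
    return float(d @ W @ d)
```

With W equal to the identity, `mahalanobis_sq` reduces to the squared Euclidean distance; the paper's method learns W (and subsequent subspace metrics) so that same-class pairs are close and different-class pairs are far under these distances.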
ISSN: 2169-3536