Support Vector Machines for Multi-label Classification


Bibliographic Details
Main Authors: Wen-Hsien Su, 蘇玟賢
Other Authors: Chih-Jen Lin
Format: Others
Language: en_US
Published: 2006
Online Access: http://ndltd.ncl.edu.tw/handle/10211135656094666705
Description
Summary: Master's === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === 94 === Multi-label classification is an important subject in machine learning, and several approaches are available for handling such problems. In this thesis we focus on support vector machines (SVMs). Since multi-label classification can be treated as an extension of multi-class classification, it is natural to adapt multi-class approaches to multi-label problems. The thesis considers three such extensions: the “binary” approach, “label combination,” and a maximal margin formulation. We conduct comprehensive experiments to evaluate their performance, and in addition give detailed derivations and investigate implementation details. Because “label combination” treats each subset of labels as an individual SVM class, any multi-class method can be applied directly. We discuss several methods of this type: “one-against-one,” the approach in [45, 46], and the method by Crammer and Singer, and compare and analyze their performance. The last two methods each solve a single optimization problem during training. We find that they perform well when the data set is not large, but their lengthy training time makes them unsuitable for very large problems. In such situations, the “label combination” approach with a “one-against-one” multi-class implementation is an effective solution. Overall, we find that directly transforming a multi-label problem into a multi-class one via “label combination” is a practically viable technique.
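
As a rough illustration of the “label combination” idea summarized above, the sketch below maps each distinct label subset observed in the training data to a single class and then trains an ordinary multi-class SVM on the result. It uses scikit-learn's SVC (which decomposes multi-class problems with one-against-one internally) as a stand-in for the LIBSVM-based implementation in the thesis; the toy data, feature dimensions, and helper names are illustrative assumptions, not taken from the thesis.

```python
# Sketch of the "label combination" reduction: every distinct subset of
# labels seen in training becomes one class of a standard multi-class SVM.
# All data below is synthetic and for illustration only.
import numpy as np
from sklearn.svm import SVC

# Toy multi-label training data: each sample carries a set of labels.
X_train = np.array([[0.1, 1.2], [0.9, 0.8], [1.1, 0.2],
                    [0.2, 1.0], [1.0, 0.9], [1.2, 0.1]])
y_train = [frozenset({0}), frozenset({0, 1}), frozenset({1}),
           frozenset({0}), frozenset({0, 1}), frozenset({1})]

# Map each observed label subset to an integer class, and keep the
# inverse map so predictions can be turned back into label sets.
subsets = sorted(set(y_train), key=sorted)
subset_to_class = {s: i for i, s in enumerate(subsets)}
class_to_subset = {i: s for s, i in subset_to_class.items()}
y_combined = np.array([subset_to_class[s] for s in y_train])

# Train a plain multi-class SVM on the combined classes
# (SVC handles the multi-class case via one-against-one).
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_combined)

# Predict a class for a new point and map it back to a label subset.
X_test = np.array([[0.95, 0.85]])
pred_class = clf.predict(X_test)[0]
print("predicted label set:", sorted(class_to_subset[pred_class]))
```

A consequence of this construction, visible in the sketch, is that only label subsets that appear in the training data can ever be predicted; the thesis's other extensions (the “binary” approach and the maximal margin formulation) do not share this restriction.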