New Methods for Handling Classification Problems Based on Fuzzy Entropy Measures and Fuzzy Information Gain Measures

Abstract (Master's thesis, National Taiwan University of Science and Technology, Department of Computer Science and Information Engineering, academic year 94): Classification techniques have been widely applied in many domains. In this thesis, we propose two new methods for handling classification problems. The first method selects feature subsets for handling classification problems based on fuzzy entropy measures, focusing on boundary samples. It can deal with both numeric and nominal features, and the features it selects yield higher average classification accuracy rates than those selected by existing methods. The second method handles classification problems based on fuzzy information gain measures. It can also deal with both numeric and nominal features, and it achieves higher average classification accuracy rates than existing methods.

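The abstract describes the methods only at a high level. As a rough illustration of how a fuzzy-entropy score can be used to rank features, the following is a minimal Python sketch; the triangular membership functions, class-wise match degrees, and coverage weighting here are illustrative assumptions and not the thesis's actual algorithm, which additionally focuses on boundary samples.

# Hypothetical sketch (not the thesis's exact algorithm): ranking features by a
# fuzzy entropy score computed from triangular membership functions.
import numpy as np

def triangular_memberships(values, n_sets=3):
    # Map a numeric feature onto n_sets evenly spaced triangular fuzzy sets.
    lo, hi = float(values.min()), float(values.max())
    centers = np.linspace(lo, hi, n_sets)
    width = max((hi - lo) / max(n_sets - 1, 1), 1e-12)
    # Membership matrix of shape (n_samples, n_sets).
    return np.clip(1.0 - np.abs(values[:, None] - centers[None, :]) / width, 0.0, 1.0)

def fuzzy_entropy_of_feature(values, labels, n_sets=3):
    # Weighted fuzzy entropy of one feature; lower values suggest a more
    # class-discriminative feature.
    mu = triangular_memberships(np.asarray(values, dtype=float), n_sets)
    classes = np.unique(labels)
    overall = mu.sum()
    score = 0.0
    for j in range(mu.shape[1]):
        coverage = mu[:, j].sum()
        if coverage == 0.0:
            continue
        # Match degree of each class within fuzzy set j.
        d = np.array([mu[labels == c, j].sum() / coverage for c in classes])
        d = d[d > 0.0]
        fe = -(d * np.log2(d)).sum()        # fuzzy entropy of fuzzy set j
        score += (coverage / overall) * fe  # weight by the set's coverage
    return score

# Toy usage: both features separate the two classes, so both should
# receive low fuzzy entropy; an uninformative feature would score higher.
X = np.array([[0.1, 5.0], [0.2, 4.8], [0.9, 1.1], [0.8, 1.3]])
y = np.array([0, 0, 1, 1])
print([round(fuzzy_entropy_of_feature(X[:, k], y), 3) for k in range(X.shape[1])])

The sketch only illustrates the entropy-scoring step; the thesis's method combines such a score with its boundary-sample analysis to choose the final feature subset.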

Bibliographic Details
Main Author: Jen-Da Shie (謝政達)
Other Authors: Shyi-Ming Chen (陳錫明)
Chinese Title: 根據模糊亂度量測法及模糊資訊增益量測法以處理分類問題之新方法
Format: Others
Language: zh-TW
Published: 2005
Online Access: http://ndltd.ncl.edu.tw/handle/93504009282990968312