Hash Functions for Polynomial Feature Mapping in Large Scale Linear Classification
Master's thesis === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === Academic year 105 === Nonlinear mappings have long been used in data classification to handle linearly inseparable problems. Among them, low-degree polynomial mappings are widely used: they consume less time and space and can sometimes achieve accuracy close to that...
Main Authors: | Xiaocong Zhou (周驍聰) |
---|---|
Other Authors: | Chih-Jen Lin (林智仁) |
Format: | Others |
Language: | en_US |
Published: | 2017 |
Online Access: | http://ndltd.ncl.edu.tw/handle/dp8zxc |
id | ndltd-TW-105NTU05392090 |
---|---|
record_format | oai_dc |
spelling | ndltd-TW-105NTU053920902019-05-15T23:39:39Z http://ndltd.ncl.edu.tw/handle/dp8zxc Hash Functions for Polynomial Feature Mapping in Large Scale Linear Classification 大規模線性分類資料低階多項式映射中雜湊函數之應用 Xiaocong Zhou 周驍聰 碩士 國立臺灣大學 資訊工程學研究所 105 Chih-Jen Lin 林智仁 2017 學位論文 ; thesis 28 en_US |
collection | NDLTD |
language | en_US |
format | Others |
sources | NDLTD |
description | Master's thesis === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === Academic year 105 === Nonlinear mappings have long been used in data classification to handle linearly inseparable problems. Among them, low-degree polynomial mappings are widely used: they consume less time and space and can sometimes achieve accuracy close to that of highly nonlinear kernels. However, explicitly forming the polynomially mapped data for large data sets can run into memory or computational difficulties. To address this, hash functions such as MurmurHash and FNV are used in packages such as Vowpal Wabbit to keep memory usage flexible. In this thesis, we propose a new hash function that is faster while achieving the same performance. The results are validated in experiments on many data sets. |
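The abstract describes the general feature-hashing ("hashing trick") approach: instead of materializing every polynomial feature, each feature pair is hashed into a fixed number of bins, so memory stays bounded no matter how many pairs the mapping generates. The sketch below illustrates that idea for degree-2 features using FNV-1a, one of the hashes the abstract mentions; the bin count, key encoding, and collision handling are illustrative assumptions, not the thesis's actual design or its proposed new hash function.

```python
def fnv1a_64(data: bytes) -> int:
    """64-bit FNV-1a hash of a byte string."""
    h = 0xcbf29ce484222325                   # FNV-1a 64-bit offset basis
    for b in data:
        h ^= b
        h = (h * 0x100000001b3) % (1 << 64)  # multiply by FNV prime, wrap to 64 bits
    return h

def hashed_degree2_features(x: dict, num_bins: int = 1 << 20) -> dict:
    """Map a sparse example x (feature index -> value) to hashed degree-2
    interaction features. Colliding pairs share a bin, trading a little
    accuracy for a fixed memory footprint."""
    out = {}
    items = sorted(x.items())
    for i, (fi, vi) in enumerate(items):
        for fj, vj in items[i:]:             # fi == fj included, for squared terms
            bin_idx = fnv1a_64(f"{fi},{fj}".encode()) % num_bins
            out[bin_idx] = out.get(bin_idx, 0.0) + vi * vj
    return out

# A 3-feature sparse example produces at most 6 hashed pair features,
# all landing in [0, num_bins), regardless of the original dimensionality.
phi = hashed_degree2_features({1: 2.0, 3: 1.0, 7: 0.5})
```

A linear classifier then trains on the `num_bins`-dimensional hashed representation directly, which is the memory-flexibility benefit the abstract attributes to packages like Vowpal Wabbit.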
author2 | Chih-Jen Lin |
author | Xiaocong Zhou (周驍聰) |
author_sort | Xiaocong Zhou |
title | Hash Functions for Polynomial Feature Mapping in Large Scale Linear Classification |
publishDate | 2017 |
url | http://ndltd.ncl.edu.tw/handle/dp8zxc |
_version_ | 1719151765477130240 |