Distributed Support Vector Machine Learning

Support Vector Machines (SVMs) are used for a growing number of applications. A fundamental constraint on SVM learning is the management of the training set. This is because the order of computations goes as the square of the size of the training set. Typically, training sets of 1000 (500 positives and 500 negatives, for example) can be managed on a PC without hard-drive thrashing. Training sets of 10,000, however, simply cannot be managed with PC-based resources. For this reason, most SVM implementations must contend with some kind of chunking process to train parts of the data at a time (10 chunks of 1000, for example, to learn the 10,000). Sequential and multi-threaded chunking methods provide a way to run the SVM on large datasets while retaining accuracy. The multi-threaded distributed SVM described in this thesis is implemented using Java RMI, and has been developed to run on a network of multi-core/multi-processor computers.
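The chunking strategy described in the abstract — splitting a large training set into manageable pieces and training on each piece concurrently — can be sketched in Java. This is an illustrative outline only, not the thesis's implementation: `ChunkedTraining`, `chunk`, and the placeholder per-chunk work are hypothetical names, and a real system would run an SMO solver on each chunk (here the "training" step is a stand-in that just counts examples).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ChunkedTraining {
    // Split a training set into fixed-size chunks, as in the chunking
    // strategy above (e.g. 10 chunks of 1000 to cover 10,000 examples).
    static <T> List<List<T>> chunk(List<T> data, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < data.size(); i += chunkSize) {
            chunks.add(data.subList(i, Math.min(i + chunkSize, data.size())));
        }
        return chunks;
    }

    public static void main(String[] args) throws Exception {
        // 10,000 placeholder examples; a real SVM would hold feature vectors.
        List<Integer> trainingSet = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) trainingSet.add(i);

        List<List<Integer>> chunks = chunk(trainingSet, 1_000);
        System.out.println("chunks: " + chunks.size());

        // Multi-threaded pass: each worker handles one chunk. The lambda is
        // a placeholder for per-chunk SVM training (e.g. an SMO solver).
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<Integer>> results = new ArrayList<>();
        for (List<Integer> c : chunks) {
            results.add(pool.submit(() -> c.size())); // placeholder "training"
        }
        int total = 0;
        for (Future<Integer> f : results) total += f.get();
        pool.shutdown();
        System.out.println("examples processed: " + total);
    }
}
```

In the thesis's setting, each worker would be a remote Java RMI service on a networked machine rather than a local thread, but the chunk-partition-and-combine shape is the same.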


Bibliographic Details
Main Author: Armond, Kenneth C., Jr.
Format: Others
Published: ScholarWorks@UNO 2008
Subjects:
SVM
SMO
Online Access: http://scholarworks.uno.edu/td/711
http://scholarworks.uno.edu/cgi/viewcontent.cgi?article=1711&context=td
id ndltd-uno.edu-oai-scholarworks.uno.edu-td-1711
record_format oai_dc
spelling ndltd-uno.edu-oai-scholarworks.uno.edu-td-17112016-10-21T17:04:41Z Distributed Support Vector Machine Learning Armond, Kenneth C., Jr. Support Vector Machines (SVMs) are used for a growing number of applications. A fundamental constraint on SVM learning is the management of the training set. This is because the order of computations goes as the square of the size of the training set. Typically, training sets of 1000 (500 positives and 500 negatives, for example) can be managed on a PC without hard-drive thrashing. Training sets of 10,000, however, simply cannot be managed with PC-based resources. For this reason, most SVM implementations must contend with some kind of chunking process to train parts of the data at a time (10 chunks of 1000, for example, to learn the 10,000). Sequential and multi-threaded chunking methods provide a way to run the SVM on large datasets while retaining accuracy. The multi-threaded distributed SVM described in this thesis is implemented using Java RMI, and has been developed to run on a network of multi-core/multi-processor computers. 2008-08-07T07:00:00Z text application/pdf http://scholarworks.uno.edu/td/711 http://scholarworks.uno.edu/cgi/viewcontent.cgi?article=1711&context=td University of New Orleans Theses and Dissertations ScholarWorks@UNO Distributed Parallel SVM Support Vector Machine Machine Learning SMO Sequential Minimal Optimization
collection NDLTD
format Others
sources NDLTD
topic Distributed
Parallel
SVM
Support Vector Machine
Machine Learning
SMO
Sequential Minimal Optimization
spellingShingle Distributed
Parallel
SVM
Support Vector Machine
Machine Learning
SMO
Sequential Minimal Optimization
Armond, Kenneth C., Jr.
Distributed Support Vector Machine Learning
description Support Vector Machines (SVMs) are used for a growing number of applications. A fundamental constraint on SVM learning is the management of the training set. This is because the order of computations goes as the square of the size of the training set. Typically, training sets of 1000 (500 positives and 500 negatives, for example) can be managed on a PC without hard-drive thrashing. Training sets of 10,000, however, simply cannot be managed with PC-based resources. For this reason, most SVM implementations must contend with some kind of chunking process to train parts of the data at a time (10 chunks of 1000, for example, to learn the 10,000). Sequential and multi-threaded chunking methods provide a way to run the SVM on large datasets while retaining accuracy. The multi-threaded distributed SVM described in this thesis is implemented using Java RMI, and has been developed to run on a network of multi-core/multi-processor computers.
author Armond, Kenneth C., Jr.
author_facet Armond, Kenneth C., Jr.
author_sort Armond, Kenneth C., Jr.
title Distributed Support Vector Machine Learning
title_short Distributed Support Vector Machine Learning
title_full Distributed Support Vector Machine Learning
title_fullStr Distributed Support Vector Machine Learning
title_full_unstemmed Distributed Support Vector Machine Learning
title_sort distributed support vector machine learning
publisher ScholarWorks@UNO
publishDate 2008
url http://scholarworks.uno.edu/td/711
http://scholarworks.uno.edu/cgi/viewcontent.cgi?article=1711&context=td
work_keys_str_mv AT armondkennethcjr distributedsupportvectormachinelearning
_version_ 1718388009283354624