Summary: | Master's thesis === National Taiwan University of Science and Technology === Department of Computer Science and Information Engineering === 98 === In this thesis, we propose two Hessian inverse updating strategies to reduce the computational cost of computing the Hessian inverse in SSVM, together with several implementation details that speed up both the Hessian inverse updates and the computation of the Newton direction. Chapter 4 also presents some additional observations of interest.
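As a minimal illustration of why maintaining the Hessian inverse helps the Newton step, the sketch below (with illustrative names only, not taken from the thesis) shows that once H^{-1} is available, the Newton direction reduces to a single matrix-vector product instead of solving a linear system at every iteration.

```python
import numpy as np

def newton_direction(H_inv, grad):
    """Newton direction d = -H^{-1} g.

    When the Hessian inverse H_inv is kept up to date by an updating
    strategy, computing d costs one O(n^2) matrix-vector product
    instead of an O(n^3) solve of H d = -g from scratch.
    """
    return -H_inv @ grad
```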
In our experiments, we demonstrate the effectiveness and speed of SSVM with the Hessian inverse updating strategies by comparing it numerically against SSVM without these strategies and against LIBSVM.
In addition, the proposed updating strategies not only update the Hessian inverse efficiently for SSVM, but also extend to variants of SVMs, for example to speed up training and cross-validation for SVM in the primal and for LSSVM.
Furthermore, the Hessian inverse updating strategies consist of incremental and decremental procedures, so they fit naturally into the family of incremental and decremental SVM methods. When an instance is added, the Hessian inverse can be updated efficiently by these strategies. In future work, we plan to develop a single-pass online learning algorithm based on the proposed Hessian inverse updating strategies.
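The incremental/decremental step can be sketched with a Sherman-Morrison-style rank-one correction. The example below is an assumption-laden illustration: it supposes that an added (or removed) instance contributes a rank-one term d * a a^T to the Hessian, which is a common form for SVM-type Hessians but is not necessarily the exact form used in the thesis.

```python
import numpy as np

def sherman_morrison_add(H_inv, a, d=1.0):
    """Rank-one update of a Hessian inverse after adding one instance.

    Assumes the new instance contributes d * a a^T to the Hessian, so
    H_new = H + d * a a^T.  By the Sherman-Morrison formula,
        H_new^{-1} = H^{-1} - d (H^{-1} a)(a^T H^{-1}) / (1 + d a^T H^{-1} a),
    which costs O(n^2) instead of the O(n^3) of a fresh inversion.
    """
    Ha = H_inv @ a                      # H^{-1} a
    denom = 1.0 + d * (a @ Ha)          # 1 + d a^T H^{-1} a
    return H_inv - d * np.outer(Ha, Ha) / denom

def sherman_morrison_remove(H_inv, a, d=1.0):
    """Decremental counterpart: undo the contribution d * a a^T of a
    removed instance (valid while the denominator stays away from zero)."""
    Ha = H_inv @ a
    denom = 1.0 - d * (a @ Ha)
    return H_inv + d * np.outer(Ha, Ha) / denom

# Usage sketch: keep H_inv in sync as instances arrive one by one.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5
    A = rng.standard_normal((20, n))
    lam = 1e-2
    H = lam * np.eye(n) + A.T @ A       # toy Hessian built from 20 instances
    H_inv = np.linalg.inv(H)

    a_new = rng.standard_normal(n)      # newly added instance
    H_inv = sherman_morrison_add(H_inv, a_new)
    H = H + np.outer(a_new, a_new)
    print(np.allclose(H_inv, np.linalg.inv(H)))   # True (up to rounding)
```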
|