Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update



Bibliographic Details
Main Author: Bo-Wei Lin (林柏維)
Other Authors: Huan Chen
Format: Others
Language: zh-TW
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5394029%22.&searchmode=basic
Description
Summary: Master's thesis === National Chung Hsing University === Department of Computer Science and Engineering === 107 === Deep learning can uncover the hidden rules of a given problem, so it is often applied in research fields such as classification, prediction, and image recognition. To fulfill this capability, deep learning continuously adjusts weights and biases to train a model that simulates a function approximating the rule of the original problem. Because the effectiveness and efficiency of adjusting these parameters depend on the mechanism of the optimizer, the choice of optimizer determines the strategy of the simulated function. One commonly used optimizer is backpropagation. In addition to plain backpropagation, there are hybrid methods that combine backpropagation with a metaheuristic algorithm. Metaheuristics are a family of search algorithms that seek the optimal solution in a solution space; owing to this nature, they outperform backpropagation in global search. Comparing the two in deep learning, a metaheuristic can find a good solution through global search, while backpropagation performs well in local search. By combining the two, the hybrid metaheuristic-backpropagation method outperforms either metaheuristic alone or backpropagation alone in both global and local search. However, the training time of the metaheuristic-backpropagation optimizer is too long. This thesis reduces training time by applying batch updates to regression problems. The batch-update method achieves better results than the original on complicated problems: accuracy improves slightly, and training time is reduced to 87.74% of the original.
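The abstract's details (network architecture, metaheuristic variant, datasets) are not given here, so the following is only a minimal sketch of the general idea it describes: alternating a metaheuristic global-search step with mini-batch backpropagation (local search) on a toy regression problem. The random-perturbation "metaheuristic", the linear model, and all hyperparameters (batch size, learning rate, epoch count) are assumptions for illustration, not the thesis's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 3x + 1 plus a little noise (assumed example problem).
X = rng.uniform(-1, 1, size=(256, 1))
y = 3.0 * X[:, 0] + 1.0 + 0.05 * rng.normal(size=256)

def mse(w, b, Xb, yb):
    """Mean squared error of the linear model w*x + b."""
    pred = Xb[:, 0] * w + b
    return float(np.mean((pred - yb) ** 2))

w, b = 0.0, 0.0
batch = 32   # mini-batch size: the "batch update" that cuts per-step cost
lr = 0.1

for epoch in range(30):
    # --- metaheuristic step (global search): propose a random jump in
    #     parameter space and keep it only if the full-data loss improves ---
    cand_w = w + rng.normal(scale=0.5)
    cand_b = b + rng.normal(scale=0.5)
    if mse(cand_w, cand_b, X, y) < mse(w, b, X, y):
        w, b = cand_w, cand_b

    # --- backpropagation step (local search), updated per mini-batch
    #     rather than per full pass, reducing training-time cost ---
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch):
        sel = idx[start:start + batch]
        Xb, yb = X[sel], y[sel]
        err = Xb[:, 0] * w + b - yb
        w -= lr * np.mean(2 * err * Xb[:, 0])   # dL/dw
        b -= lr * np.mean(2 * err)              # dL/db

print(round(mse(w, b, X, y), 4))
```

Alternating the two steps mirrors the hybrid's division of labor: the accept-if-better random jump can escape poor regions that gradient descent alone would be stuck near, while the mini-batch gradient updates refine the solution locally at a fraction of the cost of full-batch passes.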