The Improved Stochastic Fractional Order Gradient Descent Algorithm

This paper proposes improved stochastic gradient descent (SGD) algorithms with a fractional-order gradient for the online optimization problem. For three scenarios, namely the standard learning rate, the adaptive gradient learning rate, and the momentum learning rate, three new SGD algorithms are...
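The abstract refers to SGD with a fractional-order gradient. As a rough illustrative sketch only, not the paper's actual algorithms, a Caputo-type fractional gradient of order alpha in (0, 1) is commonly approximated by scaling the ordinary gradient by |x_k - x_{k-1}|^(1-alpha) / Gamma(2-alpha); the function names, the exact update rule, and the test problem below are all assumptions:

```python
import math

def fractional_sgd(grad, x0, alpha=0.9, lr=0.1, steps=100, eps=1e-8):
    """Illustrative gradient descent with a Caputo-type fractional gradient.

    The plain gradient is scaled by |x_k - x_{k-1}|^(1 - alpha) / Gamma(2 - alpha),
    a common first-order truncation of the Caputo derivative for alpha in (0, 1).
    This is a hypothetical sketch, not the update rule from the paper.
    """
    x_prev = x0
    x = x0 - lr * grad(x0)              # ordinary first step (no history yet)
    coef = 1.0 / math.gamma(2.0 - alpha)
    for _ in range(steps):
        # scale the gradient by the fractional-order history term
        g = grad(x) * coef * (abs(x - x_prev) + eps) ** (1.0 - alpha)
        x_prev, x = x, x - lr * g
    return x

# Toy example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3)
x_star = fractional_sgd(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

With alpha close to 1 the scaling factor approaches 1 and the update reduces to ordinary gradient descent; smaller alpha damps the step as the iterates slow down.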

Description

Bibliographic Details
Published in: Fractal and Fractional
Main Authors: Yang Yang, Lipo Mo, Yusen Hu, Fei Long
Format: Article
Language: English
Published: MDPI AG 2023-08-01
Subjects:
Online Access: https://www.mdpi.com/2504-3110/7/8/631