Summary: | The demand for recommender systems in the E-commerce industry has increased tremendously. E-business companies propose efficient recommender systems intended to give users accurate and relevant product recommendations drawn from huge amounts of information. To improve the performance of recommender systems, various stochastic variants of gradient-descent-based algorithms have been reported. The scalability requirements of recommender systems call for algorithms that converge quickly when generating recommendations for specific items. Using concepts from fractional calculus, an efficient variant of stochastic gradient descent (SGD) was developed for fast convergence. This fractional SGD (F-SGD) is further accelerated by adding a momentum term, yielding momentum fractional stochastic gradient descent (mF-SGD). The proposed mF-SGD method is shown to offer improved estimation accuracy and a faster convergence rate than F-SGD and standard momentum SGD across different proportions of previous gradients, fractional orders, learning rates, and numbers of features.
|
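To make the idea concrete, the sketch below shows what a single mF-SGD update might look like for a matrix-factorization recommender: the ordinary gradient of the squared prediction error is replaced by a Caputo-style fractional-order scaling, and a momentum (velocity) buffer retains a proportion of previous gradients. This is a minimal illustration under stated assumptions, not the authors' exact formulation; the function name, variable names, the power-law fractional approximation, and the omission of regularization are all assumptions introduced here.

```python
import numpy as np
from math import gamma

def mf_sgd_step(P, Q, V_p, V_q, u, i, r_ui,
                lr=0.01, beta=0.9, alpha=0.5, eps=1e-8):
    """One illustrative mF-SGD update for user u and item i with rating r_ui.

    P, Q     : user and item latent-factor matrices (n_users x k, n_items x k)
    V_p, V_q : momentum (velocity) buffers, same shapes as P and Q
    lr       : learning rate
    beta     : proportion of the previous gradient retained (momentum term)
    alpha    : fractional order, assumed 0 < alpha < 1
    """
    e_ui = r_ui - P[u] @ Q[i]                      # prediction error

    # Fractional-order gradient (assumed Caputo-type approximation):
    # ordinary gradient scaled by |w|^(1 - alpha) / Gamma(2 - alpha).
    scale = 1.0 / gamma(2.0 - alpha)
    g_p = e_ui * Q[i] * np.abs(P[u] + eps) ** (1.0 - alpha) * scale
    g_q = e_ui * P[u] * np.abs(Q[i] + eps) ** (1.0 - alpha) * scale

    # Momentum accumulation: mix the previous velocity with the new
    # fractional gradient, then apply the update to the latent factors.
    V_p[u] = beta * V_p[u] + lr * g_p
    V_q[i] = beta * V_q[i] + lr * g_q
    P[u] += V_p[u]
    Q[i] += V_q[i]
    return e_ui ** 2                               # squared error for monitoring
```

In this sketch, setting beta = 0 recovers a plain F-SGD-style step, while alpha close to 1 makes the fractional scaling approach the ordinary gradient, which is one way to see how the proportion of previous gradients and the fractional order jointly affect convergence.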