A New Hybrid Three-Term Conjugate Gradient Algorithm for Large-Scale Unconstrained Problems

Bibliographic Details
Main Authors: Qi Tian, Xiaoliang Wang, Liping Pang, Mingkun Zhang, Fanyun Meng
Format: Article
Language: English
Published: MDPI AG 2021-06-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/9/12/1353
Description
Summary: Three-term conjugate gradient methods have attracted much attention for large-scale unconstrained problems in recent years, since they offer attractive practical features such as simple computation, low memory requirements, good descent properties, and strong global convergence properties. In this paper, a hybrid three-term conjugate gradient algorithm is proposed; it possesses a sufficient descent property independent of any line search technique. Under some mild conditions, the proposed method is globally convergent for uniformly convex objective functions. Moreover, by using a modified secant equation, the proposed method is also globally convergent without any convexity assumption on the objective function. Numerical results further indicate that the proposed algorithm is more efficient and reliable than the comparison methods on the test problems.
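
To make the setting concrete, the sketch below shows a generic three-term conjugate gradient iteration with an Armijo backtracking line search. It is not the hybrid update proposed in the paper: as a stand-in, the search direction uses the classical Zhang-Zhou-Li three-term PRP formula, which likewise satisfies the sufficient descent condition g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 independently of the line search. The function names, step-size parameters, and test problem are illustrative assumptions, not taken from the article.

import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    # Generic three-term CG loop: x_{k+1} = x_k + alpha_k d_k,
    # d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k, with y_k = g_{k+1} - g_k.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        # Armijo backtracking line search along the descent direction d.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * slope and alpha > 1e-16:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Zhang-Zhou-Li three-term PRP direction (a stand-in for the paper's
        # hybrid formula): it yields g_new @ d_new = -||g_new||^2, i.e. a
        # sufficient descent direction regardless of the step size alpha.
        beta = (g_new @ y) / gnorm2
        theta = (g_new @ d) / gnorm2
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x, g

if __name__ == "__main__":
    # Usage example on the two-dimensional Rosenbrock function.
    f = lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
    grad = lambda x: np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])
    x_star, g_star = three_term_cg(f, grad, np.array([-1.2, 1.0]))
    print("minimizer:", x_star, "gradient norm:", np.linalg.norm(g_star))

Directions of this kind require storing only a few vectors of length n per iteration, which is the practical appeal for large-scale problems noted in the abstract.
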
ISSN:2227-7390