A New Modified Secant Condition for Non-linear Conjugate Gradient Methods with Global Convergence

Farhan Khalaf Muord, Muna M. M. Ali

Abstract

Conjugate gradient methods (CGM) are well-recognized techniques for handling nonlinear optimization problems. Whereas Dai and Liao (2001) employed the classical secant condition, this study utilizes the modified secant condition proposed by Yabe and Takano (2004) and Zhang and Xu (2001), which is enforced at each iteration through the strong Wolfe line search condition. On this basis, we propose three novel classes of conjugate gradient algorithms of this type and examine them on 15 well-known test functions. The modified condition uses both the available gradient and function values, so the objective function is approximated with higher-order accuracy. The global convergence of the new algorithms is established under suitable conditions. Numerical results are provided, and the efficiency of the new methods is demonstrated by comparison with other approaches.
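For orientation, the conditions named in the abstract can be sketched in the standard notation of the CG literature, with s_{k-1} = x_k - x_{k-1}, y_{k-1} = g_k - g_{k-1}, f_k = f(x_k), and g_k the gradient at x_k; the specific parameters of the three proposed algorithms appear only in the full article, so the formulas below are the generic Dai-Liao and Yabe-Takano forms (taking the usual choice u_{k-1} = s_{k-1}) rather than the authors' final versions.

% Dai--Liao (2001) extension of the conjugacy condition, t >= 0:
\[
  d_k^{\top} y_{k-1} \;=\; -\,t\, g_k^{\top} s_{k-1}.
\]

% Modified secant condition (Zhang--Xu 2001), which also uses function values:
\[
  B_k s_{k-1} \;=\; z_{k-1}, \qquad
  z_{k-1} \;=\; y_{k-1} + \frac{\theta_{k-1}}{s_{k-1}^{\top} s_{k-1}}\, s_{k-1},
\]
\[
  \theta_{k-1} \;=\; 6\bigl(f_{k-1} - f_k\bigr) + 3\bigl(g_{k-1} + g_k\bigr)^{\top} s_{k-1}.
\]

% Yabe--Takano (2004) CG parameter, obtained by replacing y_{k-1} with z_{k-1}
% in the Dai--Liao formula:
\[
  \beta_k \;=\; \frac{g_k^{\top}\bigl(z_{k-1} - t\, s_{k-1}\bigr)}{d_{k-1}^{\top} z_{k-1}}.
\]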
