
A Comparative Study of Nonlinear Conjugate Gradient Methods

We study the development of two nonlinear conjugate gradient methods: Fletcher-Reeves (FR) and Polak-Ribière (PR). FR extends the linear conjugate gradient method to nonlinear functions by making two changes: the step length αk is computed by a line search, and the residual rk (rk = b − Axk) is replaced by the gradient of the nonlinear objective function. The PR method is equivalent to the FR method for exact line searches when the underlying quadratic function is strongly convex. PR is essentially a variant of FR, differing primarily in the choice of the parameter βk. When we apply our MATLAB implementations of the FR and PR algorithms to the nonlinear Rosenbrock function, the PR method (k = 29 iterations) performs markedly better than the FR method (k = 42 iterations). However, when the same codes are applied to general nonlinear functions, specifically functions whose minimum is a large negative number far from zero and whose iterates are also large values far from zero, the PR algorithm performs poorly. This problem persists even if we run the PR algorithm for more iterations or start from an initial guess closer to the actual minimum. To improve the PR algorithm, we suggest finding a better weighting parameter βk, using a better line search method (or a line search tailored to particular functions), and identifying function-specific restart criteria.
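The FR/PR comparison described above can be illustrated with a minimal sketch of nonlinear conjugate gradient on the 2-D Rosenbrock function. This is not the thesis's MATLAB code: the Python below assumes an Armijo backtracking line search, a PR+ safeguard (βk clipped at zero), and a steepest-descent restart whenever the new direction fails to be a descent direction; none of these specific choices are stated in the abstract.

```python
def rosenbrock(x):
    # Classic 2-D Rosenbrock test function, minimum at (1, 1).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad_rosenbrock(x):
    # Analytic gradient of the Rosenbrock function.
    return [
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def backtracking(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4):
    # Armijo backtracking line search for the step length alpha_k
    # (an assumed line search, chosen here for simplicity).
    fx = f(x)
    while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + c * alpha * dot(g, d):
        alpha *= rho
    return alpha

def nonlinear_cg(f, grad, x0, beta_rule="PR", tol=1e-6, max_iter=5000):
    # Nonlinear CG: residuals are replaced by gradients and alpha_k
    # comes from a line search, as the abstract describes for FR.
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]          # first direction: steepest descent
    for k in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            return x, k
        alpha = backtracking(f, x, d, g)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        if beta_rule == "FR":
            # Fletcher-Reeves: beta_k = |g_{k+1}|^2 / |g_k|^2
            beta = dot(g_new, g_new) / dot(g, g)
        else:
            # Polak-Ribiere: beta_k = g_{k+1}^T (g_{k+1} - g_k) / |g_k|^2,
            # clipped at zero (the common PR+ safeguard; an assumption here).
            y = [gn - gi for gn, gi in zip(g_new, g)]
            beta = max(dot(g_new, y) / dot(g, g), 0.0)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        if dot(g_new, d) >= 0:
            # Restart with steepest descent if d is not a descent direction.
            d = [-gn for gn in g_new]
        g = g_new
    return x, max_iter
```

With these assumed ingredients the iteration counts will not match the thesis's figures (k = 29 vs. k = 42), since those depend on the particular line search and stopping rule used there; the sketch only shows where the two methods differ, namely in the βk update.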

Identifier: oai:union.ndltd.org:unt.edu/info:ark/67531/metadc283864
Date: 08 1900
Creators: Pathak, Subrat
Contributors: Liu, Jianguo; Iaia, Joseph; Song, Kai-Sheng
Publisher: University of North Texas
Source Sets: University of North Texas
Language: English
Detected Language: English
Type: Thesis or Dissertation
Format: Text
Rights: Public. Copyright is held by the author (Pathak, Subrat), unless otherwise noted. All rights reserved.
