A Combined Conjugate Gradient Quasi-Newton Method with a Modified BFGS Formula


Mardeen Sh. Taher, Salah G. Shareef

Abstract

The conjugate gradient (CG) and quasi-Newton (QN) methods each have advantages and drawbacks: quasi-Newton algorithms converge more rapidly than conjugate gradient methods, but they require more storage. In 1976, Buckley designed a method that combines the CG method with QN updates; its performance is better than that of conjugate gradient algorithms but not as good as that of the quasi-Newton approach. Methods of this type are called preconditioned conjugate gradient (PCG) methods. In this paper, we introduce two new PCG methods that combine the conjugate gradient method with a new quasi-Newton update. The new quasi-Newton update preserves positive definiteness, and the search direction of the new preconditioned conjugate gradient method is a descent direction. Numerical results show that the new preconditioned conjugate gradient method is more effective than standard preconditioning on several high-dimensional test problems.
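The abstract describes the general PCG scheme but not the new update itself. As background, the following is a minimal Python sketch of a BFGS-preconditioned CG iteration, assuming the standard inverse-Hessian BFGS formula and a Hestenes-Stiefel-type beta computed on the preconditioned gradient; the function names (pcg_minimize, bfgs_update), the Armijo line search, and the safeguards are illustrative choices, not the authors' modified formula.

```python
# Minimal sketch of a BFGS-preconditioned conjugate gradient (PCG) step.
# Uses the standard inverse-Hessian BFGS update, not the paper's modification.
import numpy as np

def bfgs_update(H, s, y):
    """Standard BFGS update: H+ = (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T.

    The update is skipped when the curvature condition y^T s > 0 fails,
    which is what keeps H symmetric positive definite.
    """
    sy = s @ y
    if sy <= 1e-12:
        return H  # skip update to retain positive definiteness
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def pcg_minimize(f, grad, x0, max_iter=200, tol=1e-6):
    """Preconditioned CG: d = -H g + beta * d_prev, with H from BFGS."""
    x = x0.astype(float)
    H = np.eye(len(x))
    g = grad(x)
    d = -H @ g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # simple backtracking (Armijo) line search
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        H = bfgs_update(H, s, y)
        # Hestenes-Stiefel-type beta on the preconditioned gradient
        beta = (g_new @ (H @ y)) / (d @ y) if abs(d @ y) > 1e-12 else 0.0
        d = -H @ g_new + beta * d
        if g_new @ d >= 0:  # safeguard: fall back to the pure QN direction
            d = -H @ g_new
        x, g = x_new, g_new
    return x

# usage on the Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(pcg_minimize(f, grad, np.array([-1.2, 1.0])))
```

Skipping the update when y^T s <= 0 keeps H symmetric positive definite, which in turn makes -Hg (and, with the safeguard, the full PCG direction) a descent direction; these are the two properties the abstract highlights for the new update.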


References

  1. N. Andrei, An Unconstrained Optimization Test Functions Collection, Adv. Model. Optim. 10 (2008), 147-161.
  2. M. Al-Baali, Descent Property and Global Convergence of the Fletcher-Reeves Method With Inexact Line Search, IMA J. Numer. Anal. 5 (1985), 121-124. https://doi.org/10.1093/imanum/5.1.121.
  3. C.G. Broyden, The Convergence of a Class of Double-Rank Minimization Algorithms 1. General Considerations, IMA J. Appl. Math. 6 (1970), 76-90. https://doi.org/10.1093/imamat/6.1.76.
  4. C.G. Broyden, J.E. Dennis Jr., J.J. More, On the Local and Superlinear Convergence of Quasi-Newton Methods, IMA J. Appl. Math. 12 (1973), 223-245. https://doi.org/10.1093/imamat/12.3.223.
  5. W.C. Davidon, Variable-Metric Method for Minimization, Technical Report ANL-5990, Argonne National Laboratory, (1959). https://doi.org/10.2172/4252678.
  6. R. Fletcher, Practical Methods of Optimization, John Wiley & Sons, (2013).
  7. R. Fletcher, A New Approach to Variable Metric Algorithms, Computer J. 13 (1970), 317-322. https://doi.org/10.1093/comjnl/13.3.317.
  8. J.D. Pearson, Variable Metric Methods of Minimisation, Computer J. 12 (1969), 171-178. https://doi.org/10.1093/comjnl/12.2.171.
  9. M.Sh. Taher, S.G. Shareef, A Modified Perry’s Conjugate Gradient Method Based on Powell’s Equation for Solving Large-Scale Unconstrained Optimization, Math. Stat. 9 (2021), 882-888. https://doi.org/10.13189/ms.2021.090603.
  10. M.J.D. Powell, A Fast Algorithm for Nonlinearly Constrained Optimization Calculations, in: G.A. Watson (Ed.), Numerical Analysis, Springer Berlin Heidelberg, Berlin, Heidelberg, 1978: pp. 144–157. https://doi.org/10.1007/BFb0067703.
  11. P. Wolfe, Convergence Conditions for Ascent Methods, SIAM Rev. 11 (1969), 226-235. https://doi.org/10.1137/1011036.
  12. M.R. Hestenes, E. Stiefel, Methods of Conjugate Gradients for Solving Linear Systems, J. Res. Nat. Bureau Standards 49 (1952), 409-436.
  13. G. Zoutendijk, Nonlinear Programming, Computational Methods, in: J. Abadie (Ed.), Integer and Nonlinear Programming, North-Holland, Amsterdam, 1970: pp. 37-86. https://cir.nii.ac.jp/crid/1571980075701600256.
  14. E.D. Dolan, J.J. More, Benchmarking Optimization Software With Performance Profiles, Math. Program. 91 (2002), 201-213. https://doi.org/10.1007/s101070100263.