A modifications of conjugate gradient method for unconstrained optimization problems

Authors

  • Omar Alshorman

  • Mustafa Mamat

  • Ahmad Alhawarat

  • Mohd Revaie

How to Cite

Alshorman, O., Mamat, M., Alhawarat, A., & Revaie, M. (2018). A modifications of conjugate gradient method for unconstrained optimization problems. International Journal of Engineering and Technology, 7(2.14), 21-24. https://doi.org/10.14419/ijet.v7i2.14.11146

Received date: April 6, 2018

Accepted date: April 6, 2018

Published date: April 6, 2018

DOI:

https://doi.org/10.14419/ijet.v7i2.14.11146

Keywords:

Conjugate Gradient Parameter, Inexact Line Search, Strong Wolfe-Powell Line Search, Global Convergence, Unconstrained Optimization.

Abstract

Conjugate Gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Several recent studies have been devoted to improving and modifying these methods with respect to efficiency and robustness. In this paper, a new CG parameter is proposed. The new parameter possesses global convergence properties under the Strong Wolfe-Powell (SWP) line search. The numerical results show that the proposed formula is more efficient and robust than the Polak-Ribière-Polyak (PRP), Fletcher-Reeves (FR), and Wei-Yao-Liu (WYL) parameters.
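
The paper's new CG parameter is not reproduced on this page, so the sketch below only illustrates the framework the abstract refers to: a nonlinear CG iteration with search direction d_{k+1} = -g_{k+1} + beta_k d_k, a step length satisfying the strong Wolfe-Powell conditions, and beta_k computed by the classical FR, PRP, or WYL formulas used as baselines in the comparison. The function name cg_minimize and the beta_rule switch are illustrative, and SciPy's line_search (which enforces the strong Wolfe conditions) stands in for whatever line-search implementation the authors used.

```python
import numpy as np
from scipy.optimize import line_search

def cg_minimize(f, grad, x0, beta_rule="WYL", tol=1e-6, max_iter=1000):
    """Nonlinear CG sketch with a strong Wolfe-Powell line search.

    beta_rule selects one of the classical parameters the paper compares against:
      FR : beta = ||g_new||^2 / ||g||^2
      PRP: beta = g_new^T (g_new - g) / ||g||^2
      WYL: beta = g_new^T (g_new - (||g_new||/||g||) g) / ||g||^2
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the strong Wolfe-Powell conditions
        # (sufficient decrease with delta = 1e-4, curvature with sigma = 0.1).
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart with a
            d, alpha = -g, 1e-4              # small steepest-descent step
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "FR":
            beta = (g_new @ g_new) / (g @ g)
        elif beta_rule == "PRP":
            beta = g_new @ (g_new - g) / (g @ g)
        else:                                # WYL
            scale = np.linalg.norm(g_new) / np.linalg.norm(g)
            beta = g_new @ (g_new - scale * g) / (g @ g)
        d = -g_new + max(beta, 0.0) * d      # truncate beta at zero (PRP+-style safeguard)
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from the standard starting point.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(cg_minimize(rosen, rosen_der, np.array([-1.2, 1.0]), beta_rule="WYL"))
```

The curvature constant c2 = 0.1 follows the common recommendation for CG methods; the actual SWP parameters and the proposed formula are only given in the full paper.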

References

    [1] Al-Baali, M., "Descent property and global convergence of the Fletcher-Reeves method with inexact line search", IMA Journal of Numerical Analysis, Vol.5, No.1, (1985), pp.121-124.

    [2] Alhawarat, A., Salleh, Z., Mamat, M., & Rivaie, M., "An efficient modified Polak–Ribière–Polyak conjugate gradient method with global convergence properties", Optimization Methods and Software, Vol.32, No.6, (2017), pp.1299-1312.

    [3] Alhawarat, A., & Salleh, Z., "Modification of nonlinear conjugate gradient method with weak Wolfe-Powell line search", Abstract and Applied Analysis, Hindawi Publishing Corporation, Vol.2017, (2017).

    [4] Alhawarat, A., Mamat, M., Rivaie, M., & Salleh, Z., "An efficient hybrid conjugate gradient method with the strong Wolfe-Powell line search", Mathematical Problems in Engineering, Vol.2015, (2015).

    [5] Alhawarat, A., Mamat, M., Rivaie, M., & Ismail, M., "A new modification of nonlinear conjugate gradient coefficients with global convergence properties", International Journal of Mathematical, Computational, Statistical, Natural and Physical Engineering, Vol.8, No.1, (2014), pp.54-60.

    [6] Andrei, N., "An unconstrained optimization test functions collection", Adv. Model. Optim., Vol.10, No.1, (2008), pp.147-161.

    [7] Dai, Y. H., & Yuan, Y., "A nonlinear conjugate gradient method with a strong global convergence property", SIAM Journal on Optimization, Vol.10, No.1, (1999), pp.177-182.

    [8] Dolan, E. D., & Moré, J. J., "Benchmarking optimization software with performance profiles", Mathematical Programming, Vol.91, No.2, (2002), pp.201-213.

    [9] Fletcher, R., & Reeves, C. M., "Function minimization by conjugate gradients", The Computer Journal, Vol.7, No.2, (1964), pp.149-154.

    [10] Fletcher, R., Practical methods of optimization, John Wiley & Sons, (2013).

    [11] Gilbert, J. C., & Nocedal, J., "Global convergence properties of conjugate gradient methods for optimization", SIAM Journal on Optimization, Vol.2, No.1, (1992), pp.21-42.

    [12] Gould, N. I. M., Orban, D., & Toint, Ph. L., "CUTEr and SifDec: a constrained and unconstrained testing environment, revisited", ACM Transactions on Mathematical Software (TOMS), Vol.29, No.4, (2003), pp.373-394.

    [13] Hestenes, M. R., & Stiefel, E., Methods of conjugate gradients for solving linear systems, NBS, (1952).

    [14] Liu, Y., & Storey, C., "Efficient generalized conjugate gradient algorithms, part 1: theory", Journal of Optimization Theory and Applications, Vol.69, No.1, (1991), pp.129-137.

    [15] Polak, E., & Ribière, G., "Note sur la convergence de méthodes de directions conjuguées", Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge, Vol.3, No.16, (1969), pp.35-43.

    [16] Polyak, B. T., "The conjugate gradient method in extremal problems", USSR Computational Mathematics and Mathematical Physics, Vol.9, No.4, (1969), pp.94-112.

    [17] Powell, M. J. D., "Restart procedures for the conjugate gradient method", Mathematical Programming, Vol.12, No.1, (1977), pp.241-254.

    [18] Salleh, Z., & Alhawarat, A., "An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property", Journal of Inequalities and Applications, Vol.2016, No.1, (2016), Article 110.

    [19] Wolfe, P., "Convergence conditions for ascent methods", SIAM Review, Vol.11, No.2, (1969), pp.226-235.

    [20] Wei, Z., Li, G., & Qi, L., "New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems", Applied Mathematics and Computation, Vol.179, No.2, (2006), pp.407-430.

    [21] Zhang, Y., Hao, Z., & Chuanlin, Z., "Global convergence of a modified PRP conjugate gradient method", Procedia Engineering, Vol.31, (2012), pp.986-995.
