April 2005, 1(2): 193-200. doi: 10.3934/jimo.2005.1.193

Convergence property of the Fletcher-Reeves conjugate gradient method with errors

C.Y. Wang, M.X. Li

Dept. of Appl. Math., Dalian University of Technology, Dalian, Liaoning 116024, China

Received May 2004; Revised October 2004; Published April 2005

In this paper, we consider a new kind of Fletcher-Reeves (abbr. FR) conjugate gradient method with errors, which is widely applied in neural network training. Its iteration formula is $x_{k+1}=x_{k}+\alpha_{k}(s_{k}+\omega_{k})$, where the main direction $s_{k}$ is generated by the FR conjugate gradient method and $\omega_{k}$ is an accumulated error term. The global convergence of the method is proved under mild assumptions.
Citation: C.Y. Wang, M.X. Li. Convergence property of the Fletcher-Reeves conjugate gradient method with errors. Journal of Industrial & Management Optimization, 2005, 1 (2) : 193-200. doi: 10.3934/jimo.2005.1.193
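The following is a minimal sketch, in Python, of the kind of iteration the abstract describes: an FR conjugate gradient update whose search direction is perturbed by an additive error term $\omega_{k}$ before the step $x_{k+1}=x_{k}+\alpha_{k}(s_{k}+\omega_{k})$ is taken. The step-size rule (a fixed $\alpha_k$), the Gaussian error model, and the function names are illustrative assumptions and are not the conditions analyzed in the paper.

```python
import numpy as np

def fr_cg_with_errors(grad, x0, alpha=1e-2, noise_scale=1e-3,
                      max_iter=500, tol=1e-6, rng=None):
    """Sketch of an FR conjugate gradient iteration with an additive error term.

    Implements x_{k+1} = x_k + alpha_k * (s_k + omega_k), where s_k is the
    FR direction and omega_k is a perturbation (here drawn as Gaussian noise,
    an assumption made only for illustration).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s = -g                                   # s_0 = -grad f(x_0)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        omega = noise_scale * rng.standard_normal(x.shape)  # error term omega_k
        x = x + alpha * (s + omega)          # x_{k+1} = x_k + alpha_k (s_k + omega_k)
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)     # FR formula: ||g_k||^2 / ||g_{k-1}||^2
        s = -g_new + beta * s                # s_k = -g_k + beta_k s_{k-1}
        g = g_new
    return x

# Illustrative use on a simple quadratic f(x) = 0.5 x^T A x
if __name__ == "__main__":
    A = np.diag([1.0, 10.0])
    grad = lambda x: A @ x
    print(fr_cg_with_errors(grad, x0=np.array([5.0, -3.0])))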
