
A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization

  • * Corresponding author: Yigui Ou

This work is supported by the NNSF of China (No. 11261015) and the NSF of Hainan Province (No. 2016CXTD004, No. 111001, No. 117011).

Abstract
  • This paper presents a nonmonotone scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving nonsmooth convex optimization problems, combining the scaled memoryless BFGS preconditioned conjugate gradient method with a nonmonotone technique and the Moreau-Yosida regularization. The proposed method uses approximate function and gradient values of the Moreau-Yosida regularization rather than the exact values. Under mild conditions, the global convergence of the proposed method is established. Preliminary numerical results and related comparisons show that the proposed method can be applied to solve large-scale nonsmooth convex optimization problems.
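The Moreau-Yosida regularization mentioned in the abstract replaces a nonsmooth convex $f$ by a differentiable function with a Lipschitz gradient. A minimal sketch of the exact regularization (the paper itself only requires approximate values of $F$ and its gradient; the helper names here are illustrative, assuming a proximal operator `prox_f` is available):

```python
import numpy as np

def moreau_yosida(f, prox_f, x, lam=1.0):
    """Moreau-Yosida regularization of a convex f at x:
        F(x) = min_y f(y) + ||y - x||^2 / (2*lam),
    whose minimizer is p(x) = prox_{lam f}(x) and whose gradient,
        grad F(x) = (x - p(x)) / lam,
    is Lipschitz continuous even when f is nonsmooth."""
    p = prox_f(x, lam)                                 # proximal point p(x)
    F = f(p) + np.dot(x - p, x - p) / (2.0 * lam)      # value of F at x
    g = (x - p) / lam                                  # gradient of F at x
    return F, g

# Example with f(x) = ||x||_1, whose prox is soft-thresholding (closed form).
f = lambda x: np.sum(np.abs(x))
prox = lambda x, lam: np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

F, g = moreau_yosida(f, prox, np.array([2.0, -0.3]), lam=1.0)
# F = 1.545, g = (1.0, -0.3)
```

In general the proximal point $p(x)$ has no closed form, which is why methods of this kind work with approximate function and gradient values instead of exact ones.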

    Mathematics Subject Classification: Primary: 90C25, 90C30; Secondary: 65K05.

  • Figure 1.  Performance profiles based on iterations (left) and function evaluations (right)

    Figure 2.  Performance profiles based on iterations (left) and function evaluations (right)
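The performance profiles in Figures 1 and 2 follow the standard Dolan-Moré construction: each solver's cost on a problem is divided by the best cost on that problem, and a solver's profile is the fraction of problems it solves within a factor $\tau$ of the best. A short sketch (the cost matrix `T` and the $\tau$ grid are made-up toy data):

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profile.
    T[p, s] = cost (e.g. iterations) of solver s on problem p.
    Returns rho[s, t] = fraction of problems solved by solver s
    within a factor taus[t] of the best solver on that problem."""
    ratios = T / np.min(T, axis=1, keepdims=True)   # r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau) for tau in taus]
                     for s in range(T.shape[1])])

# Toy data: 3 problems, 2 solvers.
T = np.array([[2.0, 4.0],
              [3.0, 3.0],
              [8.0, 2.0]])
rho = performance_profile(T, taus=[1.0, 2.0, 4.0])
# rho[0] = [2/3, 2/3, 1.0] -- solver 0 is best on 2 of 3 problems
# rho[1] = [2/3, 1.0, 1.0]
```

Plotting each row of `rho` against the $\tau$ grid gives profiles of the kind shown in the figures: higher and further left is better.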

    Table 1.  Testing functions for small-scale problems

    | No. | Function | $n$ | $x_0$ | $f_{opt}(x)$ |
    |-----|----------|-----|-------|--------------|
    | 1 | Rosenbrock | 2 | (-1.2, 1) | 0 |
    | 2 | Crescent | 2 | (-1.5, 2) | 0 |
    | 3 | CB2 | 2 | (1, -0.1) | 1.9522245 |
    | 4 | CB3 | 2 | (2, 2) | 2.0 |
    | 5 | DEM | 2 | (1, 1) | -3 |
    | 6 | QL | 2 | (-1, 5) | 7.2 |
    | 7 | LQ | 2 | (-0.5, -0.5) | -1.4142136 |
    | 8 | Mifflin 1 | 2 | (0.8, 0.6) | -1.0 |
    | 9 | Mifflin 2 | 2 | (-1, -1) | -1.0 |
    | 10 | Wolfe | 2 | (3, 2) | -8 |
    | 11 | Rosen-Suzuki | 4 | (0, 0, 0, 0) | -44 |
    | 12 | Shor | 5 | (0, 0, 0, 0, 1) | 22.600162 |
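For illustration, two of the Table 1 problems written out explicitly. The formulas below are the definitions commonly given in the Lukšan-Vlček test set and are an assumption here, since this page does not reproduce them; the values at the known minimizers match the $f_{opt}$ column.

```python
import numpy as np

def lq(x):
    """LQ (No. 7): max of a linear and a quadratic piece.
    Minimum -sqrt(2) ~ -1.4142136 at x = (1/sqrt(2), 1/sqrt(2))."""
    return max(-x[0] - x[1],
               -x[0] - x[1] + x[0]**2 + x[1]**2 - 1.0)

def crescent(x):
    """Crescent (No. 2): max of two crescent-shaped quadratics.
    Minimum 0 at x = (0, 0)."""
    return max(x[0]**2 + (x[1] - 1.0)**2 + x[1] - 1.0,
               -x[0]**2 - (x[1] - 1.0)**2 + x[1] + 1.0)

# Sanity check at the known minimizers.
f_lq = lq(np.array([1.0, 1.0]) / np.sqrt(2.0))   # about -1.4142136
f_cr = crescent(np.array([0.0, 0.0]))            # 0.0
```

Both functions are pointwise maxima of smooth pieces, hence convex but nondifferentiable along the curves where the pieces cross, which is exactly the structure the algorithm is designed for.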

    Table 2.  Numerical results for five algorithms (each problem occupies two rows: iterations/function evaluations for each algorithm, then the final objective value)

    | No. | Algorithm 3.1 | LWTR | YWBB | SFTR | RFBFGS |
    |-----|---------------|------|------|------|--------|
    | 1 | 3/4 | 6/13 | 54/56 | 48/49 | 4/5 |
    |   | 2.6178e-9 | 0.2976e-7 | 3.4484e-7 | 7.1545e-4 | 6.2072e-10 |
    | 2 | 3/4 | 3/7 | 14/16 | 31/32 | 35/36 |
    |   | 3.3026e-6 | 6.5430e-4 | 2.7450e-5 | 1.6000e-3 | 3.0590e-7 |
    | 3 | 4/5 | 5/12 | 13/15 | 54/55 | 5/6 |
    |   | 1.9522 | 1.9522 | 1.9522 | 1.9573 | 1.9522 |
    | 4 | 2/3 | 6/13 | 4/8 | 55/56 | 2/3 |
    |   | 2.0000 | 2.0000 | 2.0000 | 2.0100 | 2.0076 |
    | 5 | 3/4 | 8/16 | 4/7 | 5/6 | 3/4 |
    |   | -3.0000 | -3.0000 | -3.0000 | -3.0000 | -3.0000 |
    | 6 | 11/12 | 4/9 | 22/25 | 48/49 | 10/11 |
    |   | 7.2000 | 7.2000 | 7.2000 | 7.2003 | 7.2000 |
    | 7 | 3/4 | 5/10 | 6/7 | 3/4 | 2/3 |
    |   | -1.4142 | -1.4142 | -1.4142 | -1.4118 | -1.4033 |
    | 8 | 34/35 | 15/31 | 3/6 | 59/60 | 57/58 |
    |   | -1.0000 | -1.0000 | -0.9938 | -1.0000 | -1.0000 |
    | 9 | 2/3 | 8/16 | 12/13 | 4/5 | 2/3 |
    |   | -1.0000 | -1.0000 | -0.9999 | -0.9997 | -0.9813 |
    | 10 | 3/4 | 12/24 | 9/12 | 43/46 | 4/5 |
    |    | -8.0000 | -8.0000 | -8.0000 | -8.0000 | -8.0000 |
    | 11 | 49/106 | 20/40 | 8/9 | 60/61 | 25/31 |
    |    | -43.9999 | -44.0000 | -43.9493 | -39.9924 | -43.9982 |
    | 12 | 14/16 | 14/28 | 9/10 | 71/72 | 66/152 |
    |    | 22.6019 | 22.6002 | 22.6004 | 22.6892 | 22.6017 |

    Table 3.  Testing functions for large-scale problems

    | No. | Function | Initial point $x_0$ |
    |-----|----------|---------------------|
    | 1 | Generalization of MAXQ | $(1, 2, \cdots, \frac{n}{2}, -\frac{n}{2}-1, \cdots, -n)$ |
    | 2 | Generalization of MXHILB | $(1, 1, \cdots, 1)$ |
    | 3 | Chained LQ | $(-0.5, -0.5, \cdots, -0.5)$ |
    | 4 | Number of active faces | $(1, 1, \cdots, 1)$ |
    | 5 | Nonsmooth generalization of Brown 2 | $(1, 0, 1, 0, \cdots)$ |
    | 6 | Chained Mifflin 2 | $(-1, -1, \cdots, -1)$ |
    | 7 | Chained Crescent I | $(-1.5, 2, -1.5, 2, \cdots)$ |
    | 8 | Chained Crescent II | $(1, 0, 1, 0, \cdots)$ |
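Two of the Table 3 problems written out, together with their Table 3 starting points. The formulas are the definitions commonly used in the Haarala et al. large-scale test set and should be read as assumptions, since this page does not reproduce them:

```python
import numpy as np

def maxq(x):
    """Generalization of MAXQ (No. 1): f(x) = max_i x_i^2.
    Minimum 0 at the origin."""
    return np.max(x**2)

def chained_lq(x):
    """Chained LQ (No. 3): sum of n-1 pairwise LQ terms.
    Minimum -(n-1)*sqrt(2) at x_i = 1/sqrt(2) for all i."""
    a, b = x[:-1], x[1:]
    return np.sum(np.maximum(-a - b, -a - b + a**2 + b**2 - 1.0))

# Starting points from Table 3, for n = 1000.
n = 1000
x0_maxq = np.concatenate([np.arange(1, n // 2 + 1),
                          -np.arange(n // 2 + 1, n + 1)]).astype(float)
x0_clq = np.full(n, -0.5)
# maxq(x0_maxq) = n^2 = 1e6; chained_lq(x0_clq) = n - 1 = 999.0
```

Both functions scale to arbitrary dimension while keeping the pointwise-maximum (nonsmooth) structure, which is what makes them standard benchmarks for large-scale nonsmooth solvers.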

    Table 4.  Numerical results for Algorithm 3.1 and CG-YWL (each cell: iterations/function evaluations/final objective value)

    | No. | $n$ | Algorithm 3.1 | CG-YWL |
    |-----|-----|---------------|--------|
    | 1 | 1000 | 186/1601/2.6568e-10 | 225/4710/6.9354e-8 |
    |   | 5000 | 242/2725/1.2183e-10 | 250/5235/6.8798e-8 |
    |   | 10000 | 253/2997/3.4045e-10 | 261/5466/6.6528e-8 |
    | 2 | 1000 | 94/1301/4.0582e-9 | 91/1482/8.2738e-9 |
    |   | 5000 | 119/1499/2.6731e-9 | 111/1938/9.7206e-9 |
    |   | 10000 | 129/1901/2.1905e-9 | 120/2127/5.8524e-9 |
    | 3 | 1000 | 37/110/2.3278e-9 | 37/114/7.2687e-9 |
    |   | 5000 | 39/116/5.9987e-9 | 39/120/9.0932e-9 |
    |   | 10000 | 40/121/1.6943e-9 | 40/123/9.0941e-9 |
    | 4 | 1000 | 71/891/3.6735e-11 | 77/1026/6.8037e-9 |
    |   | 5000 | 82/937/7.9864e-11 | 90/1281/7.8405e-9 |
    |   | 10000 | 86/1208/5.7163e-11 | 96/1401/9.9366e-9 |
    | 5 | 1000 | 35/114/1.5021e-11 | 38/117/7.2687e-9 |
    |   | 5000 | 39/120/5.2352e-11 | 40/123/9.0932e-9 |
    |   | 10000 | 41/126/2.1136e-11 | 41/125/1.8188e-8 |
    | 6 | 1000 | 38/116/-5.3979e+3 | 37/114/-2.4975e+4 |
    |   | 5000 | 41/123/-3.2153e+4 | 39/120/-1.2498e+5 |
    |   | 10000 | 43/129/-2.0127e+4 | 40/123/-2.4998e+5 |
    | 7 | 1000 | 34/101/7.0463e-11 | 37/114/5.4897e-9 |
    |   | 5000 | 36/113/3.8145e-11 | 39/120/6.8294e-9 |
    |   | 10000 | 39/120/4.3654e-11 | 40/123/6.8253e-9 |
    | 8 | 1000 | 37/112/6.0424e-11 | 39/120/6.8185e-9 |
    |   | 5000 | 39/121/1.8205e-11 | 41/126/8.5258e-9 |
    |   | 10000 | 42/125/2.6473e-11 | 42/129/8.5262e-9 |
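For readers unfamiliar with the preconditioning named in the title, the search direction of a scaled memoryless BFGS preconditioned conjugate gradient method, in the spirit of Shanno's, can be computed with vector operations only, i.e. without storing any $n \times n$ matrix. This is a generic sketch, not the exact update of Algorithm 3.1:

```python
import numpy as np

def smbfgs_direction(g, s, y, theta):
    """Search direction d = -H g from the scaled memoryless BFGS matrix
       H = theta*(I - s y^T / y^T s)(I - y s^T / y^T s) + s s^T / y^T s,
    where g is the current gradient, s the previous step, y the gradient
    difference, and theta > 0 a scaling parameter.  H g is expanded term
    by term so only inner products and axpy operations are needed."""
    sy = np.dot(s, y)          # curvature; H is SPD when sy > 0, theta > 0
    gs = np.dot(g, s)
    gy = np.dot(g, y)
    Hg = (theta * (g - (gs / sy) * y - (gy / sy) * s
                   + (np.dot(y, y) * gs / sy**2) * s)
          + (gs / sy) * s)
    return -Hg

# With sy > 0 the direction is a descent direction: d . g = -g^T H g < 0.
g = np.array([1.0, 2.0, 3.0])
s = np.array([1.0, 0.0, 1.0])
y = np.array([0.5, 0.1, 1.2])
d = smbfgs_direction(g, s, y, theta=1.0)
```

Because $H$ is symmetric positive definite whenever $y^Ts > 0$, the resulting direction is guaranteed to be a descent direction, which is the property such preconditioned CG methods exploit in their convergence analysis.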
