doi: 10.3934/jimo.2021164
Online First


Global convergence of a modified Broyden family method for nonconvex functions

Gonglin Yuan, Zhan Wang* and Pengyuan Li

College of Mathematics and Information Science, Guangxi University, Center for Applied Mathematics of Guangxi, Nanning, Guangxi 530004, China

*Corresponding author: Zhan Wang (zhanwxd@126.com)

Received: August 2020. Revised: April 2021. Early access: September 2021.

Fund Project: This work is supported by the National Natural Science Foundation of China (Grant No. 11661009), the High Level Innovation Teams and Excellent Scholars Program in Guangxi institutions of higher education (Grant No. [2019]52), the Guangxi Natural Science Key Fund (No. 2017GXNSFDA198046), the Special Funds for Local Science and Technology Development Guided by the Central Government (No. ZY20198003), and the Special Foundation for Guangxi Ba Gui Scholars.

The Broyden family method is one of the most effective methods for solving unconstrained optimization problems. However, its global convergence for general (possibly nonconvex) functions has not been fully established. In this paper, a new Broyden family method is proposed based on the BFGS formula of Yuan and Wei (Comput. Optim. Appl. 47: 237-255, 2010). The designed algorithm has the following features: (1) a modified Broyden family update formula is given; (2) every matrix in the sequence $ \{B_k\} $ generated by the new algorithm is positive definite; and (3) global convergence of the proposed Broyden family algorithm with the Y-W-L inexact line search is established for general functions. Numerical results show that the modified Broyden family method is competitive with the classical Broyden family method.
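For orientation only, the sketch below implements the classical Broyden family update $B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k} + \phi\,(s_k^T B_k s_k)\, v_k v_k^T$ with $v_k = y_k/(y_k^T s_k) - B_k s_k/(s_k^T B_k s_k)$, where $\phi = 0$ recovers BFGS and $\phi = 1$ recovers DFP. It is not the paper's modified formula, and the Armijo backtracking step is only a placeholder for the Y-W-L inexact line search; all function names and tolerances are illustrative assumptions.

```python
# Sketch of a classical Broyden family quasi-Newton iteration (NOT the paper's
# modified formula).  Armijo backtracking stands in for the Y-W-L line search.
import numpy as np

def broyden_family_update(B, s, y, phi=0.0):
    """Classical Broyden family update of the Hessian approximation B."""
    Bs = B @ s
    sBs = s @ Bs
    ys = y @ s
    if ys <= 1e-12 or sBs <= 1e-12:      # skip the update if curvature info is unreliable
        return B
    v = y / ys - Bs / sBs
    return B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / ys + phi * sBs * np.outer(v, v)

def minimize(f, grad, x0, phi=0.0, tol=1e-6, max_iter=200):
    x, B = x0.astype(float), np.eye(x0.size)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(B, -g)        # quasi-Newton direction: B_k d_k = -g_k
        alpha = 1.0                        # Armijo backtracking (placeholder line search)
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d) and alpha > 1e-10:
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        B = broyden_family_update(B, s, grad(x_new) - g, phi)
        x = x_new
    return x

# Toy check on the Rosenbrock function (problem 3 in Table 1 is its extended version).
f = lambda x: 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
g = lambda x: np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                        200 * (x[1] - x[0]**2)])
print(minimize(f, g, np.array([-1.2, 1.0])))   # should converge near [1, 1]
```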

Citation: Gonglin Yuan, Zhan Wang, Pengyuan Li. Global convergence of a modified Broyden family method for nonconvex functions. Journal of Industrial & Management Optimization, doi: 10.3934/jimo.2021164
References:
[1] M. Al-Baali and H. Khalfan, A combined class of self-scaling and modified quasi-Newton methods, Comput. Optim. Appl., 52 (2012), 393-408. doi: 10.1007/s10589-011-9415-1.
[2] C. G. Broyden, J. E. Dennis and J. J. Moré, On the local and superlinear convergence of quasi-Newton methods, J. Inst. Math. Appl., 12 (1973), 223-245. doi: 10.1093/imamat/12.3.223.
[3] R. Byrd, J. Nocedal and Y. Yuan, Global convergence of a class of quasi-Newton methods on convex problems, SIAM J. Numer. Anal., 24 (1987), 1171-1189. doi: 10.1137/0724077.
[4] R. Byrd and J. Nocedal, A tool for the analysis of quasi-Newton methods with application to unconstrained minimization, SIAM J. Numer. Anal., 26 (1989), 727-739. doi: 10.1137/0726042.
[5] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Math. Program., 91 (2002), 201-213. doi: 10.1007/s101070100263.
[6] L. C. W. Dixon, Variable metric algorithms: Necessary and sufficient conditions for identical behavior on nonquadratic functions, J. Optimiz. Theory App., 10 (1972), 34-40. doi: 10.1007/BF00934961.
[7] J. E. Dennis Jr. and J. J. Moré, Quasi-Newton methods, motivation and theory, SIAM Rev., 19 (1977), 46-89. doi: 10.1137/1019005.
[8] J. E. Dennis and R. B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Prentice Hall, Inc., Englewood Cliffs, NJ, 1983.
[9] Y. Dai, Convergence properties of the BFGS algorithm, SIAM J. Optim., 13 (2003), 693-701. doi: 10.1137/S1052623401383455.
[10] R. Fletcher, Practical Methods of Optimization, 2nd edition, A Wiley-Interscience Publication, John Wiley & Sons, Ltd., Chichester, 1987.
[11] A. Griewank, The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients, Math. Program., 50 (1991), 141-175. doi: 10.1007/BF01594933.
[12] A. Griewank, The "global" convergence of Broyden-like methods with suitable line search, J. Austral. Math. Soc. Ser., 28 (1986), 75-92. doi: 10.1017/S0334270000005208.
[13] Z. W. Geem, Parameter estimation for the nonlinear Muskingum model using the BFGS technique, J. Irrig. Drain. Eng., 132 (2006), 474-478. doi: 10.1061/(ASCE)0733-9437(2006)132:5(474).
[14] D. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, J. Comput. Appl. Math., 129 (2001), 15-35. doi: 10.1016/S0377-0427(00)00540-9.
[15] D. Li and M. Fukushima, On the global convergence of the BFGS method for nonconvex unconstrained optimization problems, SIAM J. Optim., 11 (2001), 1054-1064. doi: 10.1137/S1052623499354242.
[16] A. Ouyang, L. Liu, Z. Sheng and F. Wu, A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm, Math. Probl. Eng., 2015 (2015), 1-15. doi: 10.1155/2015/573894.
[17] A. Ouyang, Z. Tang, K. Li, A. Sallam and E. Sha, Estimating parameters of Muskingum model using an adaptive hybrid PSO algorithm, Int. J. Pattern Recogn., 28 (2014), 1-29. doi: 10.1142/S0218001414590034.
[18] M. J. D. Powell, On the convergence of the variable metric algorithm, J. Inst. Math. Appl., 7 (1971), 21-36. doi: 10.1093/imamat/7.1.21.
[19] M. J. D. Powell, Some global convergence properties of a variable metric algorithm for minimization without exact line searches, Nonlinear Programming, SIAM-AMS Proc., Amer. Math. Soc., Providence, RI, 9 (1976), 53-72.
[20] J. D. Pearson, Variable metric methods of minimization, Comput. J., 12 (1969/70), 171-178. doi: 10.1093/comjnl/12.2.171.
[21] J. Schropp, A note on minimization problems and multistep methods, Numer. Math., 78 (1997), 87-101. doi: 10.1007/s002110050305.
[22] J. Schropp, One-step and multistep procedures for constrained minimization problems, IMA J. Numer. Anal., 20 (2000), 135-152. doi: 10.1093/imanum/20.1.135.
[23] P. L. Toint, Global convergence of the partitioned BFGS algorithm for convex partially separable optimization, Math. Program., 36 (1986), 290-306. doi: 10.1007/BF02592063.
[24] D. J. Van Wyk, Differential optimization techniques, Appl. Math. Model., 8 (1984), 419-424. doi: 10.1016/0307-904X(84)90048-9.
[25] M. N. Vrahatis, G. S. Androulakis and J. N. Lambrinos, A class of gradient unconstrained minimization algorithms with adaptive stepsize, J. Comput. Appl. Math., 114 (2000), 367-386. doi: 10.1016/S0377-0427(99)00276-9.
[26] Z. Wei, G. Li and L. Qi, New quasi-Newton methods for unconstrained optimization problems, Appl. Math. Comput., 175 (2006), 1156-1188. doi: 10.1016/j.amc.2005.08.027.
[27] Z. Wei, G. Yu, G. Yuan and Z. Lian, The superlinear convergence of a modified BFGS-type method for unconstrained optimization, Comput. Optim. Appl., 29 (2004), 315-332. doi: 10.1023/B:COAP.0000044184.25410.39.
[28] H. Yabe, H. Ogasawara and M. Yoshino, Local and superlinear convergence of quasi-Newton methods based on modified secant conditions, J. Comput. Appl. Math., 205 (2007), 617-632. doi: 10.1016/j.cam.2006.05.018.
[29] G. Yuan and X. Lu, A new line search method with trust region for unconstrained optimization, Comm. Appl. Nonlinear Anal., 15 (2008), 35-49.
[30] G. Yuan, Z. Sheng, B. Wang, W. Hu and C. Li, The global convergence of a modified BFGS method for nonconvex functions, J. Comput. Appl. Math., 327 (2018), 274-294. doi: 10.1016/j.cam.2017.05.030.
[31] G. Yuan, Z. Wang and P. Li, A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems, Calcolo, 57 (2020), 21 pp. doi: 10.1007/s10092-020-00383-5.
[32] G. Yuan, Z. Wei and X. Lu, Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search, Appl. Math. Model., 47 (2017), 811-825. doi: 10.1016/j.apm.2017.02.008.
[33] G. Yuan and Z. Wei, Convergence analysis of a modified BFGS method on convex minimizations, Comput. Optim. Appl., 47 (2010), 237-255. doi: 10.1007/s10589-008-9219-0.
[34] G. Yuan and Z. Wei, New line search methods for unconstrained optimization, J. Korean Statist. Soc., 38 (2009), 29-39. doi: 10.1016/j.jkss.2008.05.004.
[35] G. Yuan, Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems, Optim. Lett., 3 (2009), 11-21. doi: 10.1007/s11590-008-0086-5.
[36] Y. Yuan and W. Sun, Theory and Methods of Optimization, Science Press of China, Beijing, 1999.


Figure 1.  Performance profiles of the algorithm MBroyden with different $\phi$
Figure 2.  Performance profiles of these methods (CPU)
Figure 3.  Performance profiles of these methods (NI)
Figure 4.  Performance profiles of these methods (NFG)
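Figures 2-4 report Dolan-Moré performance profiles [5] with respect to CPU time, NI, and NFG. As a rough illustration only (the cost matrix and names below are made-up examples, not data from the paper), a profile can be computed from a problems-by-solvers cost matrix: each cost is divided by the best cost on its problem, and $\rho_s(\tau)$ is the fraction of problems that solver $s$ finishes within a factor $\tau$ of the best.

```python
# Illustrative computation of Dolan-More performance profiles [5];
# the cost matrix is hypothetical, not results from the paper.
import numpy as np

def performance_profile(costs, taus):
    """costs: (n_problems, n_solvers) array of CPU time / NI / NFG (np.inf = failure).
    Returns rho with rho[i, j] = fraction of problems that solver j solves
    within a factor taus[i] of the best solver on each problem."""
    best = costs.min(axis=1, keepdims=True)   # best cost per problem
    ratios = costs / best                     # performance ratios r_{p,s}
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus])

costs = np.array([[1.0, 1.3],                 # 3 made-up problems, 2 solvers
                  [2.0, np.inf],              # inf marks a solver failure
                  [0.5, 0.6]])
print(performance_profile(costs, taus=[1.0, 2.0, 4.0]))
```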
Table 1.  Test problems
No. Test problems No. Test problems
1 Extended Freudenstein and Roth Function 38 ARWHEAD Function (CUTE)
2 Extended Trigonometric Function 39 ARWHEAD Function (CUTE)
3 Extended Rosenbrock Function 40 NONDQUAR Function (CUTE)
4 Extended White and Holst Function 41 DQDRTIC Function (CUTE)
5 Extended Beale Function 42 EG2 Function (CUTE)
6 Extended Penalty Function 43 DIXMAANA Function (CUTE)
7 Perturbed Quadratic Function 44 DIXMAANB Function (CUTE)
8 Raydan 1 Function 45 DIXMAANC Function (CUTE)
9 Raydan 2 Function 46 DIXMAANE Function (CUTE)
10 Diagonal 1 Function 47 Partial Perturbed Quadratic Function
11 Diagonal 2 Function 48 Broyden Tridiagonal Function
12 Diagonal 3 Function 49 Almost Perturbed Quadratic Function
13 Hager Function 50 Tridiagonal Perturbed Quadratic Function
14 Generalized Tridiagonal 1 Function 51 EDENSCH Function (CUTE)
15 Extended Tridiagonal 1 Function 52 VARDIM Function (CUTE)
16 Extended Three Exponential Terms Function 53 STAIRCASE S1 Function
17 Generalized Tridiagonal 2 Function 54 LIARWHD Function (CUTE)
18 Diagonal 4 Function 55 DIAGONAL 6 Function
19 Diagonal 5 Function 56 DIXON3DQ Function (CUTE)
20 Extended Himmelblau Function 57 DIXMAANF Function (CUTE)
21 Generalized PSC1 Function 58 DIXMAANG Function (CUTE)
22 Extended PSC1 Function 59 DIXMAANH Function (CUTE)
23 Extended Powell Function 60 DIXMAANI Function (CUTE)
24 Extended Block Diagonal BD1 Function 61 DIXMAANJ Function (CUTE)
25 Extended Maratos Function 62 DIXMAANK Function (CUTE)
26 Extended Cliff Function 63 DIXMAANL Function (CUTE)
27 Quadratic Diagonal Perturbed Function 64 DIXMAAND Function (CUTE)
28 Extended Wood Function 65 ENGVAL1 Function (CUTE)
29 Extended Hiebert Function 66 FLETCHCR Function (CUTE)
30 Quadratic Function QF1 Function 67 COSINE Function (CUTE)
31 Extended Quadratic Penalty QP1 Function 68 Extended DENSCHNB Function (CUTE)
32 Extended Quadratic Penalty QP2 Function 69 DENSCHNF Function (CUTE)
33 A Quadratic Function QF2 Function 70 SINQUAD Function (CUTE)
34 Extended EP1 Function 71 BIGGSB1 Function (CUTE)
35 Extended Tridiagonal-2 Function 72 Partial Perturbed Quadratic PPQ2 Function
36 BDQRTIC Function (CUTE) 73 Scaled Quadratic SQ1 Function
37 TRIDIA Function (CUTE)
