Two modified DY conjugate gradient methods for unconstrained optimization problems
Authors: Zhibin Zhu, Dongdong Zhang, Shuo Wang
Affiliations: 1. School of Mathematics and Computing Science, Guangxi Colleges and Universities Key Laboratory of Data Analysis and Computation, Guilin University of Electronic Technology, Guilin 541004, China
2. School of Electronic Engineering and Automation, Guilin University of Electronic Technology, Guilin 541004, China
Journal: Applied Mathematics and Computation, 2020, Vol. 373
Source database: Elsevier Journal
DOI: 10.1016/j.amc.2019.125004
Keywords: Unconstrained optimization problem; Conjugate gradient method; Standard Wolfe line search; Global convergence; Sufficient descent property
Abstract: In this paper, we study unconstrained optimization problems, and two modified DY conjugate gradient methods (the DDY1 method and the DDY2 method) are proposed based on the DY conjugate gradient method. Using the standard Wolfe line search, we prove the global convergence of both methods. The search direction of the DDY1 method is a descent direction under the standard Wolfe line search. The search direction generated by the DDY2 method satisfies the sufficient descent condition, a property that does not depend on any line search. Preliminary numerical results show that the two methods are effective.
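For orientation, the sketch below shows the classical (unmodified) Dai-Yuan conjugate gradient iteration with a Wolfe line search, which the abstract takes as its starting point. It is a minimal illustration only: the paper's DDY1/DDY2 modifications of the DY coefficient are not reproduced here, SciPy's `line_search` (which enforces the strong Wolfe conditions) stands in for the standard Wolfe line search, and the quadratic test problem and tolerances are assumptions for demonstration.

```python
# Minimal sketch of the classical Dai-Yuan (DY) conjugate gradient method.
# NOT the paper's DDY1/DDY2 methods; those modify the DY coefficient.
import numpy as np
from scipy.optimize import line_search

def dy_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # SciPy's line_search enforces the (strong) Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.9)[0]
        if alpha is None:                   # line search failed; take a small step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # y_k = g_{k+1} - g_k
        denom = d @ y
        # Dai-Yuan coefficient: beta_k = ||g_{k+1}||^2 / (d_k^T y_k)
        beta = (g_new @ g_new) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d               # d_{k+1} = -g_{k+1} + beta_k d_k
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Illustrative convex quadratic test problem (an assumption, not from the paper).
    A = np.array([[3.0, 0.5], [0.5, 2.0]])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(dy_conjugate_gradient(f, grad, np.array([1.0, -1.0])))
```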
Full-text access: Elsevier (partner)
Impact factor: 1.349 (2012)
