A counterexample regarding “New study on neural networks: The essential order of approximation”
Author: Steffen Goebbels
Affiliation: Niederrhein University of Applied Sciences, Faculty of Electrical Engineering and Computer Science, Institute for Pattern Recognition, D-47805 Krefeld, Germany
Journal: Neural Networks, 2020, Vol. 123, pp. 234-235
Source database: Elsevier Journal
DOI: 10.1016/j.neunet.2019.12.007
Keywords: Neural networks; Sharpness of error bounds; Rates of convergence
Abstract: The paper “New study on neural networks: The essential order of approximation” by Jianjun Wang and Zongben Xu, which appeared in Neural Networks 23 (2010), deals with upper and lower estimates for the error of best approximation with sums of nearly exponential type activation functions in terms of moduli of smoothness. In particular, the presented lower bound is astonishingly good. However, the proof is incorrect and the bound is wrong.
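For context, the error estimates mentioned in the abstract are stated in terms of a modulus of smoothness. The following is a minimal LaTeX sketch of the standard r-th modulus of smoothness from approximation theory; it is the textbook definition, and the exact normalization used by Wang and Xu may differ:

% Standard r-th modulus of smoothness of a function f in a normed
% function space, defined via r-th forward differences. Textbook
% convention; not necessarily the normalization of the paper itself.
\[
  \omega_r(f, t) \;=\; \sup_{0 < h \le t} \bigl\| \Delta_h^r f \bigr\|,
  \qquad
  \Delta_h^r f(x) \;=\; \sum_{k=0}^{r} (-1)^{r-k} \binom{r}{k} f(x + kh).
\]
% A Jackson-type upper bound for the best approximation error E_n(f)
% with n terms then typically takes the form E_n(f) <= C * omega_r(f, 1/n);
% matching lower bounds of this kind are what the counterexample disputes.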
Full-text access: Elsevier (partner)
Impact factor: 1.927 (2012)
