Depth with Nonlinearity Creates No Bad Local Minima in ResNets
Kenji Kawaguchi, Yoshua Bengio
7/9/2019 (v1: 10/21/2018) · stat.ML | cs.AI | cs.LG | math.OC
Abstract:
In this paper, we prove that depth with nonlinearity creates no bad local minima in a type of arbitrarily deep ResNets with arbitrary nonlinear activation functions, in the sense that the values of all local minima are no worse than the global minimum value of corresponding classical machine-learning models, and are guaranteed to further improve via residual representations. As a result, this paper provides an affirmative answer to an open question stated in a paper in the conference on Neural Information Processing Systems 2018. This paper advances the optimization theory of deep learning only for ResNets and not for other network architectures.
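Intuitively, the ResNet considered here contains the corresponding classical model as a special case: zeroing every residual branch leaves only the base predictor, so the classical model's global minimum is attainable within the ResNet's parameter space. The following is a minimal, hypothetical PyTorch sketch of that reduction; the `TinyResNet` class, its layer sizes, and the zeroing routine are illustrative assumptions, not the paper's architecture or proof construction.

```python
import torch
import torch.nn as nn

# Hypothetical illustration only: not the architecture analyzed in the paper.
class TinyResNet(nn.Module):
    """A toy ResNet whose residual branches can be switched off entirely."""

    def __init__(self, dim: int, depth: int):
        super().__init__()
        # Base ("classical") predictor: a single linear map.
        self.base = nn.Linear(dim, dim)
        # Nonlinear residual branches added on top of the identity.
        self.blocks = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
             for _ in range(depth)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = x
        for block in self.blocks:
            h = h + block(h)  # residual connection: identity plus learned correction
        return self.base(h)


# Zeroing the last layer of every residual branch makes each branch output 0,
# so the whole network collapses to the base linear model.
model = TinyResNet(dim=4, depth=3)
with torch.no_grad():
    for block in model.blocks:
        block[2].weight.zero_()
        block[2].bias.zero_()

x = torch.randn(2, 4)
assert torch.allclose(model(x), model.base(x))
```

The assertion checks that, with all residual branches zeroed, the network's output coincides with the base linear model, which is the sense in which the classical model sits inside the ResNet's parameter space.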
Tags: Minima, Nonlinearity, ResNets, depth, paper, 2018. Source: https://www.cnblogs.com/cx2016/p/13814191.html