
Various Optimization Algorithms For Training Neural Network [Repost]

Reposted from: https://towardsdatascience.com/optimizers-for-training-neural-network-59450d71caf6

 

Optimizers help neural networks converge to good results faster

Gradient Descent
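In plain (batch) gradient descent, the gradient is computed over the entire training set for every update. The original post shows the update rule as an image; a standard formulation (parameters θ, learning rate η, loss J) is:

```latex
\theta = \theta - \eta \cdot \nabla_\theta J(\theta)
```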

Stochastic Gradient Descent
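Stochastic gradient descent (SGD) instead updates the parameters from a single training example (x^{(i)}, y^{(i)}) at a time, which makes each step cheap but noisy. A standard formulation is:

```latex
\theta = \theta - \eta \cdot \nabla_\theta J(\theta; x^{(i)}, y^{(i)})
```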

Mini-Batch Gradient Descent
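Mini-batch gradient descent is the usual compromise: each update is computed over a small batch of n examples, which reduces the variance of SGD while keeping each step cheap. A standard formulation is:

```latex
\theta = \theta - \eta \cdot \nabla_\theta J(\theta; x^{(i:i+n)}, y^{(i:i+n)})
```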

Momentum
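Momentum accumulates an exponentially decaying average of past gradients in a velocity term v_t (decay γ, commonly around 0.9) and moves the parameters along it, which damps oscillations across steep directions. A standard formulation is:

```latex
v_t = \gamma v_{t-1} + \eta \nabla_\theta J(\theta)
\theta = \theta - v_t
```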

Nesterov Accelerated Gradient
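Nesterov accelerated gradient (NAG) is a look-ahead variant of momentum: the gradient is evaluated at the approximate future position θ − γv_{t−1} rather than at the current parameters, so the method can correct course before overshooting. A standard formulation is:

```latex
v_t = \gamma v_{t-1} + \eta \nabla_\theta J(\theta - \gamma v_{t-1})
\theta = \theta - v_t
```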

NAG vs momentum at local minima

Adagrad

The derivative of the loss function for a given parameter at time step t.
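The equation itself is an image in the original post; in the usual Adagrad notation, g_{t,i} denotes that derivative:

```latex
g_{t,i} = \nabla_\theta J(\theta_{t,i})
```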

Parameter update for each parameter i at time/iteration t
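Adagrad divides the learning rate for each parameter by the square root of that parameter's accumulated squared gradients (G_t is a diagonal matrix with G_{t,ii} = Σ_τ g_{τ,i}²; ε prevents division by zero), so frequently updated parameters take smaller steps. The standard update, reconstructing the image in the original post, is:

```latex
\theta_{t+1,i} = \theta_{t,i} - \frac{\eta}{\sqrt{G_{t,ii} + \epsilon}} \, g_{t,i}
```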

AdaDelta

Update the parameters
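AdaDelta fixes Adagrad's ever-growing denominator by replacing the accumulated sum with an exponentially decaying average of squared gradients E[g²]_t (decay ρ). Writing RMS[x]_t = √(E[x²]_t + ε), a standard formulation of the update shown as an image in the original post is:

```latex
E[g^2]_t = \rho E[g^2]_{t-1} + (1 - \rho) g_t^2
\Delta\theta_t = -\frac{\mathrm{RMS}[\Delta\theta]_{t-1}}{\mathrm{RMS}[g]_t} \, g_t
\theta_{t+1} = \theta_t + \Delta\theta_t
```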

Adam

First- and second-order moment estimates of the gradient
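Adam tracks a first-moment estimate m_t (a decaying mean of the gradients, decay β₁) and a second-moment estimate v_t (a decaying mean of the squared gradients, decay β₂). The standard formulation of the two estimates is:

```latex
m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2
```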

Update the parameters
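Because m_t and v_t start at zero, they are biased toward zero early in training, so Adam bias-corrects them before applying the step. The standard update is:

```latex
\hat{m}_t = \frac{m_t}{1 - \beta_1^t}, \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t}
\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{\hat{v}_t} + \epsilon} \, \hat{m}_t
```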

Comparison between various optimizers

Comparison 1

Comparison 2
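The two comparison figures are animations in the original article and did not survive the repost. As a stand-in, here is a minimal, self-contained NumPy sketch (my own illustration, not code from the article) that runs plain SGD, momentum, and Adam on a toy elongated quadratic and prints where each optimizer lands:

```python
# Minimal sketch: comparing SGD, momentum, and Adam on a 2-D quadratic.
import numpy as np

def loss_grad(theta):
    # Gradient of f(x, y) = 0.5 * (10*x^2 + y^2), an elongated bowl.
    return np.array([10.0, 1.0]) * theta

def run(update, steps=200):
    theta, state = np.array([2.0, 2.0]), {}
    for t in range(1, steps + 1):
        theta = update(theta, loss_grad(theta), state, t)
    return theta

def sgd(theta, g, state, t, lr=0.05):
    return theta - lr * g

def momentum(theta, g, state, t, lr=0.05, gamma=0.9):
    v = state.get("v", np.zeros_like(theta))
    state["v"] = gamma * v + lr * g          # decaying average of gradients
    return theta - state["v"]

def adam(theta, g, state, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    m = state.get("m", np.zeros_like(theta))
    v = state.get("v", np.zeros_like(theta))
    state["m"] = m = b1 * m + (1 - b1) * g            # first moment
    state["v"] = v = b2 * v + (1 - b2) * g * g        # second moment
    m_hat, v_hat = m / (1 - b1 ** t), v / (1 - b2 ** t)  # bias correction
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps)

for name, upd in [("sgd", sgd), ("momentum", momentum), ("adam", adam)]:
    print(name, run(upd))
```

On this bowl, momentum and Adam typically make faster progress along the flat y direction than plain SGD, which mirrors what the original animations illustrate.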

Conclusions

 

Source: https://www.cnblogs.com/lightsong/p/14643083.html