"Deep Learning by Andrew Ng" Study Notes 006: Optimization algorithms
Author: Internet
http://www.ai-start.com/dl2017/html/lesson2-week2.html
Optimization algorithms
Mini-batch gradient descent
Understanding mini-batch gradient descent
Exponentially weighted averages
Understanding exponentially weighted averages
Bias correction in exponentially weighted averages
Gradient descent with Momentum
RMSprop
RMSprop is short for the root mean square prop algorithm.
Adam optimization algorithm
Adam stands for Adaptive Moment Estimation.
The Adam optimization algorithm essentially combines Momentum and RMSprop into a single update rule; a sketch is shown below.
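Below is a minimal NumPy sketch of one Adam step for a single parameter array `w` with gradient `dw`. The helper name `adam_update` and the toy usage loop are illustrative assumptions, not part of the original notes; the default hyperparameters (beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8) follow the values recommended in the course.

```python
import numpy as np

def adam_update(w, dw, v, s, t, learning_rate=0.001,
                beta1=0.9, beta2=0.999, epsilon=1e-8):
    # Momentum part: exponentially weighted average of the gradients
    v = beta1 * v + (1 - beta1) * dw
    # RMSprop part: exponentially weighted average of the squared gradients
    s = beta2 * s + (1 - beta2) * dw**2
    # Bias correction (t is the 1-based iteration count)
    v_corrected = v / (1 - beta1**t)
    s_corrected = s / (1 - beta2**t)
    # Combined update: Momentum numerator, RMSprop denominator
    w = w - learning_rate * v_corrected / (np.sqrt(s_corrected) + epsilon)
    return w, v, s

# Hypothetical usage: minimize the toy objective f(w) = ||w||^2, gradient 2w
w = np.array([1.0, -2.0, 3.0])
v = np.zeros_like(w)
s = np.zeros_like(w)
for t in range(1, 5001):
    dw = 2 * w                      # gradient of the toy objective
    w, v, s = adam_update(w, dw, v, s, t)
print(w)                            # ends up close to the minimum at the origin
```

In practice usually only the learning rate is tuned; beta1, beta2, and epsilon are typically left at their defaults, which is part of why Adam works well as a drop-in optimizer across many architectures.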
Learning rate decay
The problem of local optima