
PyTorch grad


Gradient

Clarification
Derivative
Partial derivative
Gradient

\[\begin{aligned}
&\nabla f = \left(\frac{\partial f}{\partial x_1},\frac{\partial f}{\partial x_2},\dots,\frac{\partial f}{\partial x_n}\right)\\
&z = y^2-x^2\\
&\frac{\partial z}{\partial x}=-2x\\
&\frac{\partial z}{\partial y}=2y\\
&\nabla z = (-2x,\,2y)
\end{aligned}\]
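
A minimal sketch (assuming PyTorch is installed) that checks this gradient with autograd; the sample point `(x, y) = (3.0, 4.0)` is arbitrary:

```python
import torch

# z = y^2 - x^2, so dz/dx = -2x and dz/dy = 2y.
x = torch.tensor(3.0, requires_grad=True)
y = torch.tensor(4.0, requires_grad=True)

z = y ** 2 - x ** 2
z.backward()  # populates x.grad and y.grad

print(x.grad)  # tensor(-6.) == -2 * 3
print(y.grad)  # tensor(8.)  ==  2 * 4
```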

How to search for minima?

\[\begin{aligned}
&\theta_{t+1} = \theta_{t} - \alpha_{t}\nabla f(\theta_{t})\\
&\text{Function:}\\
&J(\theta_1,\theta_2) = \theta_1^2 + \theta_2^2\\
&\text{Objective:}\\
&\min_{\theta_1,\theta_2} J(\theta_1,\theta_2)\\
&\text{Update rules:}\\
&\theta_1:=\theta_1 - \alpha \frac{\partial}{\partial\theta_1}J(\theta_1,\theta_2)\\
&\theta_2:=\theta_2 - \alpha \frac{\partial}{\partial\theta_2}J(\theta_1,\theta_2)\\
&\text{Derivatives:}\\
&\frac{\partial}{\partial\theta_1}J(\theta_1,\theta_2)=\frac{\partial}{\partial\theta_1}\theta_1^2 + \frac{\partial}{\partial\theta_1}\theta_2^2=2\theta_1\\
&\frac{\partial}{\partial\theta_2}J(\theta_1,\theta_2)=\frac{\partial}{\partial\theta_2}\theta_1^2 + \frac{\partial}{\partial\theta_2}\theta_2^2=2\theta_2
\end{aligned}\]
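
One way to run these updates in PyTorch rather than by hand is a small loop like the following sketch; the learning rate `alpha = 0.1`, the starting point, and the step count are arbitrary choices:

```python
import torch

# Gradient descent on J(theta1, theta2) = theta1^2 + theta2^2.
theta = torch.tensor([4.0, -3.0], requires_grad=True)
alpha = 0.1

for step in range(100):
    J = (theta ** 2).sum()
    J.backward()                     # dJ/dtheta_i = 2 * theta_i
    with torch.no_grad():
        theta -= alpha * theta.grad  # theta_i := theta_i - alpha * dJ/dtheta_i
    theta.grad.zero_()               # clear accumulated gradients

print(theta)  # both components approach 0, the minimum of J
```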

Common Functions

| Function family | Function | Derivative |
| --- | --- | --- |
| Constant | $$C$$ | $$0$$ |
| Line | $$x$$ | $$1$$ |
| | $$ax$$ | $$a$$ |
| Square | $$x^2$$ | $$2x$$ |
| Square root | $$\sqrt{x}$$ | $$\frac{1}{2}x^{-\frac{1}{2}}$$ |
| Exponential | $$e^x$$ | $$e^x$$ |
| | $$a^x$$ | $$\ln(a)\,a^x$$ |
| Logarithms | $$\ln(x)$$ | $$\frac{1}{x}$$ |
| | $$\log_a(x)$$ | $$\frac{1}{x\ln(a)}$$ |
| Trigonometry | $$\sin(x)$$ | $$\cos(x)$$ |
| | $$\cos(x)$$ | $$-\sin(x)$$ |
| | $$\tan(x)$$ | $$\sec^2(x)$$ |
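
These table entries can be spot-checked with autograd; a small sketch, with `x = 2.0` as an arbitrary evaluation point:

```python
import torch

x = torch.tensor(2.0, requires_grad=True)

# d/dx ln(x) = 1/x
y = torch.log(x)
y.backward()
print(x.grad.item())  # 0.5 == 1 / 2.0

# d/dx sin(x) = cos(x)
x.grad.zero_()
y = torch.sin(x)
y.backward()
print(x.grad.item(), torch.cos(x).item())  # both ~= -0.4161
```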
Example:

\[\begin{aligned}
&f=[y-(xw+b)]^2\\
&g=xw+b\\
&\frac{\partial f}{\partial w}=2(y-g)\cdot\left(-\frac{\partial g}{\partial w}\right)=-2(y-(xw+b))\,x\\
&\frac{\partial f}{\partial b}=2(y-g)\cdot\left(-\frac{\partial g}{\partial b}\right)=-2(y-(xw+b))\\
&\nabla f=\bigl(-2(y-(xw+b))\,x,\ -2(y-(xw+b))\bigr)
\end{aligned}\]
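
This is exactly what autograd computes; a minimal sketch, with arbitrary sample values, comparing the hand-derived gradient to PyTorch's:

```python
import torch

# f = (y - (x*w + b))^2; the values of x, y, w, b are arbitrary.
x = torch.tensor(2.0)
y = torch.tensor(5.0)
w = torch.tensor(1.0, requires_grad=True)
b = torch.tensor(0.5, requires_grad=True)

f = (y - (x * w + b)) ** 2
f.backward()  # fills w.grad and b.grad

# Closed-form gradients from the derivation above
with torch.no_grad():
    df_dw = -2 * (y - (x * w + b)) * x
    df_db = -2 * (y - (x * w + b))

print(w.grad, df_dw)  # tensor(-10.) tensor(-10.)
print(b.grad, df_db)  # tensor(-5.)  tensor(-5.)
```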

Source: https://www.cnblogs.com/Carrawayang/p/15801189.html