PyTorch grad
Gradient
Clarification

- derivative (导数)
- partial derivative (偏微分)
- gradient (梯度)
How to search for minima?
Gradient descent searches for a minimum by repeatedly stepping against the gradient:

\[\theta_{t+1} = \theta_{t} - \alpha_{t}\nabla f(\theta_{t})\]

Function:

\[J(\theta_1,\theta_2) = \theta_1^2 + \theta_2^2\]

Objective:

\[\min_{\theta_1,\theta_2} J(\theta_1,\theta_2)\]

Update rules:

\[\begin{aligned}
\theta_1 &:= \theta_1 - \alpha \frac{\partial}{\partial\theta_1}J(\theta_1,\theta_2)\\
\theta_2 &:= \theta_2 - \alpha \frac{\partial}{\partial\theta_2}J(\theta_1,\theta_2)
\end{aligned}\]

Derivatives:

\[\begin{aligned}
\frac{\partial}{\partial\theta_1}J(\theta_1,\theta_2) &= \frac{\partial}{\partial\theta_1}\theta_1^2 + \frac{\partial}{\partial\theta_1}\theta_2^2 = 2\theta_1\\
\frac{\partial}{\partial\theta_2}J(\theta_1,\theta_2) &= \frac{\partial}{\partial\theta_2}\theta_1^2 + \frac{\partial}{\partial\theta_2}\theta_2^2 = 2\theta_2
\end{aligned}\]
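For concreteness, here is one update starting from an illustrative point $(\theta_1, \theta_2) = (1, 2)$ with $\alpha = 0.1$ (values chosen here purely for illustration):

\[\begin{aligned}
\theta_1 &:= 1 - 0.1 \cdot 2 \cdot 1 = 0.8\\
\theta_2 &:= 2 - 0.1 \cdot 2 \cdot 2 = 1.6
\end{aligned}\]

Each step scales both coordinates by $1 - 2\alpha = 0.8$, so the iterates converge to the minimum at $(0, 0)$.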
Common Functions

| | Function | Derivative |
|---|---|---|
| Constant | $$C$$ | $$0$$ |
| Line | $$x$$ | $$1$$ |
| | $$ax$$ | $$a$$ |
| Square | $$x^2$$ | $$2x$$ |
| Square root | $$\sqrt{x}$$ | $$\frac{1}{2}x^{-\frac{1}{2}}$$ |
| Exponential | $$e^x$$ | $$e^x$$ |
| | $$a^x$$ | $$\ln(a)\,a^x$$ |
| Logarithms | $$\ln(x)$$ | $$\frac{1}{x}$$ |
| | $$\log_a(x)$$ | $$\frac{1}{x\ln(a)}$$ |
| Trigonometry | $$\sin(x)$$ | $$\cos(x)$$ |
| | $$\cos(x)$$ | $$-\sin(x)$$ |
| | $$\tan(x)$$ | $$\sec^2(x)$$ |
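Each rule in the table can be spot-checked with PyTorch's autograd. A minimal sketch, assuming an arbitrary test point x = 1.2 (not from the original post):

```python
import torch

# Spot-check table entries: autograd's derivative vs. the closed form.
x = torch.tensor(1.2, requires_grad=True)

for f, df in [(torch.sin, torch.cos),        # d/dx sin(x) = cos(x)
              (torch.exp, torch.exp),        # d/dx e^x    = e^x
              (torch.log, lambda t: 1 / t)]: # d/dx ln(x)  = 1/x
    (g,) = torch.autograd.grad(f(x), x)      # derivative computed by autograd
    assert torch.allclose(g, df(x.detach()))
```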
Example
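Below is a minimal PyTorch sketch of the update rules above, letting autograd compute $\nabla J$ instead of hand-coding $2\theta$; the starting point $(4, -3)$, learning rate $0.1$, and step count are illustrative choices:

```python
import torch

# Minimize J(theta1, theta2) = theta1^2 + theta2^2 by gradient descent.
theta = torch.tensor([4.0, -3.0], requires_grad=True)  # illustrative start
alpha = 0.1                                            # illustrative step size

for _ in range(50):
    J = (theta ** 2).sum()            # J = theta1^2 + theta2^2
    J.backward()                      # fills theta.grad with [2*theta1, 2*theta2]
    with torch.no_grad():
        theta -= alpha * theta.grad   # theta := theta - alpha * dJ/dtheta
    theta.grad.zero_()                # clear the gradient before the next step

print(theta)  # both components shrink toward the minimum at (0, 0)
```

Each iteration multiplies every component by $1 - 2\alpha = 0.8$, matching the hand-derived update shown earlier.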