
Some notes on the `gradient` argument of the `backward` function


When a scalar is differentiated with respect to a vector (i.e. the output is a scalar), this argument is not needed. But when a vector is differentiated with respect to a vector, calling `backward()` without it raises the error "grad can be implicitly created only for scalar outputs". The `gradient` argument can be understood as follows.
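A minimal sketch reproducing the error, assuming PyTorch is installed. The tensor values here are arbitrary; the point is only that `y` is a vector:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2          # y is a vector, not a scalar

try:
    y.backward()   # no gradient argument for a vector output
except RuntimeError as e:
    print(e)       # grad can be implicitly created only for scalar outputs
```

If `y` were reduced to a scalar first (e.g. `y.sum().backward()`), no `gradient` argument would be required.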
When \(\mathbf y\) is differentiated with respect to \(\mathbf x\), the result is the gradient matrix (the transposed Jacobian), written as:

\[\frac{\partial \mathbf{y}}{\partial \mathbf{x}}=\left(\begin{array}{cccc} \frac{\partial y_{1}}{\partial x_{1}} & \frac{\partial y_{2}}{\partial x_{1}} & \cdots & \frac{\partial y_{m}}{\partial x_{1}} \\ \frac{\partial y_{1}}{\partial x_{2}} & \frac{\partial y_{2}}{\partial x_{2}} & \cdots & \frac{\partial y_{m}}{\partial x_{2}} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial y_{1}}{\partial x_{n}} & \frac{\partial y_{2}}{\partial x_{n}} & \cdots & \frac{\partial y_{m}}{\partial x_{n}} \end{array}\right) \]

When we read off the gradient of \(x\), what is stored is the matrix-vector product x.grad = \(\frac{\partial \mathbf{y}}{\partial \mathbf{x}}\,\text{gradient}\), i.e. the `gradient` argument is the vector that this matrix is multiplied by.
For a worked example, see this blog post: https://blog.csdn.net/kuan__/article/details/108828003
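The formula above can also be checked directly. In this sketch (assuming PyTorch is installed, with arbitrary example values) \(y_i = x_i^2\), so the matrix \(\frac{\partial \mathbf{y}}{\partial \mathbf{x}}\) is diagonal with entries \(2x_i\), and x.grad should equal that matrix times the `gradient` vector:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x ** 2                           # y_i = x_i^2, so dy_i/dx_i = 2*x_i

v = torch.tensor([1.0, 0.1, 0.01])   # the `gradient` argument
y.backward(gradient=v)

# Since the gradient matrix is diag(2*x), the product is elementwise:
# x.grad = 2 * x * v
print(x.grad)                        # tensor([2.0000, 0.4000, 0.0600])
```

Passing `v = torch.ones_like(y)` would recover the same result as `y.sum().backward()`, which is why the all-ones vector is a common default choice for this argument.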

Source: https://www.cnblogs.com/meitiandouyaokaixin/p/16339669.html