
Code Notes 17: The difference between requires_grad_() and requires_grad in PyTorch


Problem

Thanks to PyCharm, I never expected to run into a problem like this. I was only trying to check the requires_grad attribute of the tensors in my BatchNorm2d layers, and I picked the name straight from the autocomplete dropdown. It turns out requires_grad_() and requires_grad are completely different things.

Code

requires_grad

for m in net.modules():
    if isinstance(m, nn.BatchNorm2d):
        print(m, m.weight.requires_grad, m.bias.requires_grad)

With requires_grad, the loop prints:

BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True) False False
BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True) False False
BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True) True True
BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True) True True
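For context, those False False rows just mean gradient tracking was switched off on those BatchNorm parameters earlier, which typically happens when part of a pretrained backbone is frozen. A minimal sketch of how that state can come about (using a torchvision ResNet-50 as a stand-in for my net, so the exact layer names here are assumptions):

import torch.nn as nn
from torchvision.models import resnet50

net = resnet50()  # hypothetical stand-in for the network inspected above

# Freeze the early part of the backbone; leave the later stages trainable.
for name, p in net.named_parameters():
    if name.startswith(("conv1", "bn1", "layer1", "layer2")):
        p.requires_grad = False  # plain attribute assignment, no trailing underscore

for m in net.modules():
    if isinstance(m, nn.BatchNorm2d):
        print(m, m.weight.requires_grad, m.bias.requires_grad)  # early BNs print False False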

requires_grad_()

for m in net.modules():
    if isinstance(m, nn.BatchNorm2d):
        print(m, m.weight.requires_grad_(), m.bias.requires_grad_())

This time it prints each tensor's full values together with its requires_grad attribute:

tensor([2.3888e-01, 2.9136e-01, 3.1615e-01, 2.7122e-01, 2.1731e-01, 3.0903e-01,
        2.2937e-01, 2.3086e-01, 2.1129e-01, 2.8054e-01, 1.9923e-01, 3.1894e-01,
        1.7955e-01, 1.1246e-08, 1.9704e-01, 2.0996e-01, 2.4317e-01, 2.1697e-01,
        1.9415e-01, 3.1569e-01, 1.9648e-01, 2.3214e-01, 2.1962e-01, 2.1633e-01,
        2.4357e-01, 2.9683e-01, 2.3852e-01, 2.1162e-01, 1.4492e-01, 2.9388e-01,
        2.2911e-01, 9.2716e-02, 4.3334e-01, 2.0782e-01, 2.7990e-01, 3.5804e-01,
        2.9315e-01, 2.5306e-01, 2.4210e-01, 2.1755e-01, 3.8645e-01, 2.1003e-01,
        3.6805e-01, 3.3724e-01, 5.0826e-01, 1.9341e-01, 2.3914e-01, 2.6652e-01,
        3.9020e-01, 1.9840e-01, 2.1694e-01, 2.6666e-01, 4.9806e-01, 2.3553e-01,
        2.1349e-01, 2.5951e-01, 2.3547e-01, 1.7579e-01, 4.5354e-01, 1.7102e-01,
        2.4903e-01, 2.5148e-01, 3.8020e-01, 1.9665e-01], requires_grad=True)
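One thing worth pointing out before going further: this print loop is not a harmless read. requires_grad_() takes a default argument of True, so the loop above re-enables gradients even on the parameters that reported False in the previous section. A tiny sketch:

import torch

w = torch.randn(3)
w.requires_grad = False    # pretend this is a frozen BatchNorm weight

print(w.requires_grad_())  # prints the tensor, but also flips the flag to True
print(w.requires_grad)     # True: the "frozen" parameter is trainable again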

Solution

So I went and looked at the source code and docstring of requires_grad_():

    def requires_grad_(self: T, requires_grad: bool = True) -> T:
        r"""Change if autograd should record operations on parameters in this
        module.

        This method sets the parameters' :attr:`requires_grad` attributes
        in-place.

        This method is helpful for freezing part of the module for finetuning
        or training parts of a model individually (e.g., GAN training).

        See :ref:`locally-disable-grad-doc` for a comparison between
        `.requires_grad_()` and several similar mechanisms that may be confused with it.

        Args:
            requires_grad (bool): whether autograd should record operations on
                                  parameters in this module. Default: ``True``.

        Returns:
            Module: self
        """
        for p in self.parameters():
            p.requires_grad_(requires_grad)
        return self

Fine, so it's actually a method: roughly speaking, it sets the requires_grad attribute to True (the default) in place and then returns the object itself, which is exactly why the print loop above spat out the whole tensor instead of a bool.
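One more detail: the source I pasted above is actually nn.Module.requires_grad_(), which walks over every parameter of a module; torch.Tensor has its own requires_grad_() with the same in-place semantics for a single tensor, and that is the one my loop was really calling on m.weight. The module-level version is what the docstring's freezing/finetuning use case refers to. A small sketch with made-up layers:

import torch.nn as nn

# Hypothetical two-part model: freeze the backbone, train only the head.
backbone = nn.Sequential(nn.Conv2d(3, 16, 3), nn.BatchNorm2d(16), nn.ReLU())
head = nn.Linear(16, 10)

backbone.requires_grad_(False)  # Module.requires_grad_ applies to every parameter inside
head.requires_grad_(True)

print(all(not p.requires_grad for p in backbone.parameters()))  # True
print(all(p.requires_grad for p in head.parameters()))          # True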

So what about requires_grad?

That one is simply an attribute defined on the Tensor class.
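Put side by side, the difference is just attribute vs. in-place method (the trailing underscore is PyTorch's usual in-place convention, like add_ or mul_). A minimal sketch:

import torch

t = torch.ones(2)

# Reading the attribute: just a bool, no side effects.
print(t.requires_grad)          # False

# The attribute can also be assigned directly on a leaf tensor:
t.requires_grad = True
print(t.requires_grad)          # True

# The method mutates the flag and returns the tensor itself:
print(t.requires_grad_(False))  # tensor([1., 1.])
print(t.requires_grad)          # False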

Source: https://www.cnblogs.com/HumbleHater/p/16391935.html