
Layer_Norm



Code example

import torch
import torch.nn as nn


input = torch.tensor([[2., 2.],
                      [3., 3.]])
print(input)
"""
tensor([[2., 2.],
        [3., 3.]])
"""
layer_norm = nn.LayerNorm([2, 2])
output = layer_norm(input)

print(output)
"""
tensor([[-1.0000, -1.0000],
        [ 1.0000,  1.0000]], grad_fn=<NativeLayerNormBackward>)
"""
# Summary
"""
Following the LayerNorm formula:
E(x)   = (2 + 2 + 3 + 3) / 4 = 2.5
Var(x) = ((2-2.5)**2 + (2-2.5)**2 + (3-2.5)**2 + (3-2.5)**2) / 4 = 0.25 = 0.5**2
Substituting into
y = (x - E(x)) / (Var(x) + eps)**0.5
gives -1 for the entries equal to 2 and +1 for the entries equal to 3,
which matches the output above (the default eps = 1e-5 only shifts the values slightly).
"""

Source: https://www.cnblogs.com/zranguai/p/15638554.html