
External-Attention-tensorflow (work in progress)

1. Residual Attention Usage

1.1. Paper

"Residual Attention: A Simple but Effective Method for Multi-Label Recognition" (ICCV 2021)

1.2. Overview


1.3. Usage Code

from attention.ResidualAttention import ResidualAttention
import tensorflow as tf

input = tf.random.normal(shape=(50, 7, 7, 512))
resatt = ResidualAttention(num_class=1000, la=0.2)
output = resatt(input)
print(output.shape)
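
As a reference for what such a layer computes, below is a minimal, hypothetical sketch of class-specific residual attention (CSRA) in the spirit of the ICCV 2021 paper; the class name and details are illustrative only, not the repo's ResidualAttention implementation.

import tensorflow as tf

class ResidualAttentionSketch(tf.keras.layers.Layer):
    # Hypothetical sketch of class-specific residual attention (CSRA); not the repo's class.
    def __init__(self, num_class=1000, la=0.2):
        super().__init__()
        self.la = la
        # A 1x1 convolution acts as a per-location classifier over the feature map.
        self.fc = tf.keras.layers.Conv2D(num_class, kernel_size=1, use_bias=False)

    def call(self, x):                                  # x: (B, H, W, C), channels-last
        score = self.fc(x)                              # per-location class scores: (B, H, W, num_class)
        avg_logit = tf.reduce_mean(score, axis=[1, 2])  # global-average-pooling branch
        max_logit = tf.reduce_max(score, axis=[1, 2])   # class-specific max-attention branch
        return avg_logit + self.la * max_logit          # (B, num_class)

With the (50, 7, 7, 512) input above, a layer of this form returns logits of shape (50, 1000).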

2. External Attention Usage

2.1. Paper

"Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks"

2.2. Overview


2.3. Usage Code

from attention.ExternalAttention import ExternalAttention
import tensorflow as tf

input = tf.random.normal(shape=(50, 49, 512))
ea = ExternalAttention(d_model=512, S=8)
output = ea(input)
print(output.shape)
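
The paper's core idea is to replace self-attention with two small learnable "memory" layers and a double normalization, so the cost is linear in the number of tokens. A minimal, hypothetical sketch (illustrative only, not the repo's ExternalAttention class) could look like:

import tensorflow as tf

class ExternalAttentionSketch(tf.keras.layers.Layer):
    # Hypothetical sketch of external attention; illustrative only.
    def __init__(self, d_model=512, S=8):
        super().__init__()
        self.mk = tf.keras.layers.Dense(S, use_bias=False)        # external "key" memory unit
        self.mv = tf.keras.layers.Dense(d_model, use_bias=False)  # external "value" memory unit

    def call(self, x):                      # x: (B, N, d_model)
        attn = self.mk(x)                   # (B, N, S)
        attn = tf.nn.softmax(attn, axis=1)  # softmax over the token dimension
        attn = attn / (tf.reduce_sum(attn, axis=2, keepdims=True) + 1e-9)  # l1 norm over S (double normalization)
        return self.mv(attn)                # (B, N, d_model)

For the (50, 49, 512) input above, the output keeps the same shape.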

3. Self Attention Usage

3.1. Paper

"Attention Is All You Need"

3.2. Overview


3.3. Usage Code

from attention.SelfAttention import ScaledDotProductAttention
import tensorflow as tf

input = tf.random.normal((50, 49, 512))
sa = ScaledDotProductAttention(d_model=512, d_k=512, d_v=512, h=8)
output = sa(input, input, input)
print(output.shape)
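
The repo's ScaledDotProductAttention wraps this into a multi-head layer with learned Q/K/V projections; the core computation from the paper, Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, can be written on its own as a small single-head reference function:

import tensorflow as tf

def scaled_dot_product_attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V -- single-head reference, not the repo's class.
    d_k = tf.cast(tf.shape(k)[-1], q.dtype)
    scores = tf.matmul(q, k, transpose_b=True) / tf.sqrt(d_k)  # (B, N_q, N_k)
    weights = tf.nn.softmax(scores, axis=-1)                   # attention weights over the keys
    return tf.matmul(weights, v)                               # (B, N_q, d_v)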

4. Simplified Self Attention Usage

4.1. Paper

None

4.2. Overview


4.3. Usage Code

from attention.SimplifiedSelfAttention import SimplifiedScaledDotProductAttention
import tensorflow as tf

input = tf.random.normal((50, 49, 512))
ssa = SimplifiedScaledDotProductAttention(d_model=512, h=8)
output = ssa(input, input, input)
print(output.shape)
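
No paper is listed for this variant; judging from the class name and the identical call signature, it presumably applies the same scaled dot-product attention as Section 3 but without the learned query/key/value projection layers.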

5. Squeeze-and-Excitation Attention Usage

5.1. Paper

"Squeeze-and-Excitation Networks"

5.2. Overview


5.3. Usage Code

from attention.SEAttention import SEAttention
import tensorflow as tf

input = tf.random.normal((50, 7, 7, 512))
se = SEAttention(channel=512, reduction=8)
output = se(input)
print(output.shape)
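
A squeeze-and-excitation block squeezes each channel to a single value by global average pooling, passes the result through a small two-layer bottleneck MLP ending in a sigmoid, and rescales the input channels with the resulting weights. A minimal channels-last sketch (hypothetical, not the repo's SEAttention class):

import tensorflow as tf

class SEAttentionSketch(tf.keras.layers.Layer):
    # Hypothetical sketch of a squeeze-and-excitation block (channels-last).
    def __init__(self, channel=512, reduction=8):
        super().__init__()
        self.squeeze = tf.keras.layers.GlobalAveragePooling2D()          # squeeze: (B, H, W, C) -> (B, C)
        self.fc1 = tf.keras.layers.Dense(channel // reduction, activation="relu")
        self.fc2 = tf.keras.layers.Dense(channel, activation="sigmoid")  # per-channel gates in [0, 1]

    def call(self, x):                              # x: (B, H, W, C)
        w = self.fc2(self.fc1(self.squeeze(x)))     # excitation: (B, C)
        w = tf.reshape(w, (-1, 1, 1, x.shape[-1]))  # broadcast over the spatial dimensions
        return x * w                                # channel-recalibrated features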

6. SK Attention Usage

6.1. Paper

"Selective Kernel Networks"

6.2. Overview


6.3. Usage Code

from attention.SKAttention import SKAttention
import tensorflow as tf

input = tf.random.normal((50, 7, 7, 512))
sk = SKAttention(channel=512, reduction=8)
output = sk(input)
print(output.shape)
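
Selective-kernel attention runs the input through branches with different kernel sizes, fuses them by summation, and then uses a softmax over the branches to form a per-channel mixture of the branch outputs. A rough, hypothetical two-branch sketch follows; the repo's SKAttention will differ in details such as grouped/dilated convolutions and batch normalization.

import tensorflow as tf

class SKAttentionSketch(tf.keras.layers.Layer):
    # Hypothetical two-branch selective-kernel sketch; illustrative only.
    def __init__(self, channel=512, reduction=8, kernels=(3, 5)):
        super().__init__()
        self.branches = [tf.keras.layers.Conv2D(channel, k, padding="same", activation="relu")
                         for k in kernels]
        d = max(channel // reduction, 32)
        self.fc_z = tf.keras.layers.Dense(d, activation="relu")           # compact descriptor
        self.fc_attn = [tf.keras.layers.Dense(channel) for _ in kernels]  # one head per branch

    def call(self, x):                                    # x: (B, H, W, C)
        feats = [branch(x) for branch in self.branches]   # split: per-kernel features
        u = tf.add_n(feats)                               # fuse by element-wise sum
        s = tf.reduce_mean(u, axis=[1, 2])                # global average pooling: (B, C)
        z = self.fc_z(s)                                  # (B, d)
        logits = tf.stack([fc(z) for fc in self.fc_attn], axis=1)  # (B, branches, C)
        weights = tf.nn.softmax(logits, axis=1)           # select: softmax across branches
        stacked = tf.stack(feats, axis=1)                 # (B, branches, H, W, C)
        weights = weights[:, :, tf.newaxis, tf.newaxis, :]
        return tf.reduce_sum(stacked * weights, axis=1)   # weighted combination: (B, H, W, C)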

7. CBAM Attention Usage

7.1. Paper

"CBAM: Convolutional Block Attention Module"

7.2. Overview


7.3. Usage Code

from attention.CBAM import CBAMBlock
import tensorflow as tf

input = tf.random.normal((50, 7, 7, 512))
kernel_size = input.shape[1]  # spatial size of the feature map (channels-last layout)
cbam = CBAMBlock(channel=512, reduction=16, kernel_size=kernel_size)
output = cbam(input)
print(output.shape)
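
CBAM chains two gates: channel attention (average- and max-pooled descriptors passed through a shared MLP) followed by spatial attention (channel-wise average and max maps passed through a k x k convolution). A minimal channels-last sketch with hypothetical names, not the repo's CBAMBlock:

import tensorflow as tf

class CBAMSketch(tf.keras.layers.Layer):
    # Hypothetical CBAM sketch: channel attention followed by spatial attention (channels-last).
    def __init__(self, channel=512, reduction=16, kernel_size=7):
        super().__init__()
        # Shared MLP for the channel-attention branch.
        self.mlp = tf.keras.Sequential([
            tf.keras.layers.Dense(channel // reduction, activation="relu"),
            tf.keras.layers.Dense(channel),
        ])
        # Convolution over the stacked avg/max maps for the spatial-attention branch.
        self.spatial_conv = tf.keras.layers.Conv2D(1, kernel_size, padding="same")

    def call(self, x):                                      # x: (B, H, W, C)
        # Channel attention.
        avg = tf.reduce_mean(x, axis=[1, 2])                # (B, C)
        mx = tf.reduce_max(x, axis=[1, 2])                  # (B, C)
        ca = tf.nn.sigmoid(self.mlp(avg) + self.mlp(mx))    # (B, C)
        x = x * tf.reshape(ca, (-1, 1, 1, x.shape[-1]))
        # Spatial attention.
        avg_sp = tf.reduce_mean(x, axis=-1, keepdims=True)  # (B, H, W, 1)
        max_sp = tf.reduce_max(x, axis=-1, keepdims=True)   # (B, H, W, 1)
        sa = tf.nn.sigmoid(self.spatial_conv(tf.concat([avg_sp, max_sp], axis=-1)))
        return x * sa                                       # gated features: (B, H, W, C)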

Source: https://www.cnblogs.com/ccfco/p/16386026.html