Convolutional Block Attention Module from [Woo2018].
Parameters
in_channels (int) – number of input feature channels
reduction_rate (int) – reduction ratio of the fully-connected layers
kernel_size (int) – kernel size of the convolutional layer
References
Woo, S., Park, J., Lee, J., Kweon, I., 2018.
CBAM: Convolutional Block Attention Module. ECCV 2018.
Methods
Apply the Convolutional Block Attention Module to the input tensor.
Parameters
x (torch.Tensor) – input feature map of shape (B, C, H, W)
Returns
torch.Tensor – output tensor of the same shape as the input
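The module described above can be sketched as follows. This is a minimal illustrative implementation of CBAM as described in [Woo2018] (channel attention from avg- and max-pooled descriptors through a shared MLP, followed by spatial attention from channel-wise avg and max maps), not the exact code of this library's class; the layer names inside the sketch are assumptions.

```python
import torch
import torch.nn as nn


class CBAM(nn.Module):
    """Sketch of CBAM: channel attention followed by spatial attention."""

    def __init__(self, in_channels, reduction_rate=16, kernel_size=7):
        super().__init__()
        # Channel attention: a shared MLP applied to the average- and
        # max-pooled channel descriptors, with a bottleneck of size
        # in_channels // reduction_rate.
        self.mlp = nn.Sequential(
            nn.Linear(in_channels, in_channels // reduction_rate),
            nn.ReLU(inplace=True),
            nn.Linear(in_channels // reduction_rate, in_channels),
        )
        # Spatial attention: a conv over the concatenated channel-wise
        # average and max maps, padded to preserve spatial size.
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        b, c, _, _ = x.shape
        # Channel attention weights, shape (B, C, 1, 1)
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)
        # Spatial attention weights, shape (B, 1, H, W)
        s = torch.cat(
            [x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1
        )
        return x * torch.sigmoid(self.conv(s))
```

A typical call preserves the input shape, so the block can be dropped between existing convolutional layers:

```python
x = torch.randn(2, 32, 16, 16)
out = CBAM(in_channels=32, reduction_rate=16, kernel_size=7)(x)
# out.shape == (2, 32, 16, 16)
```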