braindecode.modules.ECA#

class braindecode.modules.ECA(in_channels, kernel_size)[source]#

Efficient Channel Attention [Wang2021].

Parameters:
  • in_channels (int) – number of input feature channels

  • kernel_size (int) – kernel size of the 1D convolutional layer; controls the extent of local cross-channel interaction. Must be odd.
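The mechanism behind these parameters can be sketched as follows. This is a minimal illustration of the ECA idea (global average pooling, a 1D convolution across channels, and a sigmoid gate), not braindecode's exact implementation; the class name `ECASketch` and all shapes are assumptions for the example.

```python
import torch
from torch import nn

class ECASketch(nn.Module):
    """Minimal sketch of Efficient Channel Attention (not braindecode's code)."""

    def __init__(self, kernel_size=3):
        super().__init__()
        assert kernel_size % 2 == 1, "kernel_size must be odd"
        # 1D convolution over the channel dimension; an odd kernel with this
        # padding keeps the number of channels unchanged. kernel_size sets how
        # many neighbouring channels interact.
        self.conv = nn.Conv1d(1, 1, kernel_size,
                              padding=kernel_size // 2, bias=False)

    def forward(self, x):  # x: (batch, channels, H, W)
        # Squeeze the spatial dimensions to one descriptor per channel.
        y = x.mean(dim=(2, 3))                      # (batch, channels)
        # Treat channels as a 1D sequence for local cross-channel interaction.
        y = self.conv(y.unsqueeze(1)).squeeze(1)    # (batch, channels)
        w = torch.sigmoid(y)                        # per-channel weights
        # Rescale each input channel by its attention weight.
        return x * w[:, :, None, None]

attn = ECASketch(kernel_size=3)
out = attn(torch.randn(2, 16, 1, 64))
print(out.shape)  # attention only rescales channels, so the shape is unchanged
```

Because attention only rescales channels, the output shape always matches the input shape, which is why larger kernel sizes trade wider channel interaction for more parameters without changing the module's interface.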

Examples

>>> import torch
>>> from braindecode.modules import ECA
>>> module = ECA(in_channels=16, kernel_size=3)
>>> inputs = torch.randn(2, 16, 1, 64)
>>> outputs = module(inputs)
>>> outputs.shape
torch.Size([2, 16, 1, 64])

References

[Wang2021]

Wang, Q. et al., 2021. ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks. CVPR 2021.

Methods

forward(x)[source]#

Apply the Efficient Channel Attention block to the input tensor.

Parameters:

x (torch.Tensor)

Return type:

torch.Tensor