braindecode.modules.TimeDistributed#
- class braindecode.modules.TimeDistributed(module)[source]#
Apply module on multiple windows.
Apply the provided module to a sequence of windows and return the concatenation of the results. Useful for sequence-to-prediction models (e.g. a sleep stager that must map a sequence of consecutive windows to the label of the middle window in the sequence).
- Parameters:
module (nn.Module) – Module to be applied to the input windows. Must accept an input of shape (batch_size, n_channels, n_times).
Examples
>>> import torch
>>> from torch import nn
>>> from braindecode.modules import TimeDistributed
>>> module = TimeDistributed(nn.Conv1d(3, 4, kernel_size=3, padding=1))
>>> inputs = torch.randn(2, 5, 3, 20)
>>> outputs = module(inputs)
>>> outputs.shape
torch.Size([2, 5, 80])
Methods
- forward(x)[source]#
- Parameters:
x (torch.Tensor) – Sequence of windows, of shape (batch_size, seq_len, n_channels, n_times).
- Returns:
Shape (batch_size, seq_len, output_size).
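The reshape-apply-reshape pattern behind this module can be sketched in plain PyTorch. This is an illustrative re-implementation under the assumption that batch and sequence dimensions are merged before the wrapped module runs, and each window's output is flattened afterward (the actual braindecode source may differ in detail):

```python
import torch
from torch import nn


class TimeDistributedSketch(nn.Module):
    """Illustrative sketch of applying a module across a window sequence."""

    def __init__(self, module):
        super().__init__()
        self.module = module

    def forward(self, x):
        # x: (batch_size, seq_len, n_channels, n_times)
        b, s, c, t = x.shape
        # Merge batch and sequence dims so the wrapped module sees
        # ordinary (batch, n_channels, n_times) windows.
        out = self.module(x.view(b * s, c, t))
        # Restore the sequence dim, flattening each window's output
        # into a single feature vector of size `output_size`.
        return out.view(b, s, -1)


module = TimeDistributedSketch(nn.Conv1d(3, 4, kernel_size=3, padding=1))
inputs = torch.randn(2, 5, 3, 20)
outputs = module(inputs)
print(outputs.shape)  # torch.Size([2, 5, 80])
```

The flattening in the last step is what makes the return shape `(batch_size, seq_len, output_size)`: here each window yields a `(4, 20)` feature map, so `output_size` is 80.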
- Return type:
torch.Tensor
Examples using braindecode.modules.TimeDistributed#
Sleep staging on the Sleep Physionet dataset using Chambon2018 network
Sleep staging on the Sleep Physionet dataset using Eldele2021