braindecode.models.TimeDistributed

class braindecode.models.TimeDistributed(module)

Apply a module to multiple windows.

Apply the provided module to each window in a sequence and return the concatenation of its outputs. Useful with sequence-to-prediction models (e.g., a sleep stager that must map a sequence of consecutive windows to the label of the middle window in the sequence).

Parameters:

module (nn.Module) – Module to be applied to the input windows. Must accept an input of shape (batch_size, n_channels, n_times).

Methods

forward(x)
Parameters:

x (torch.Tensor) – Sequence of windows, of shape (batch_size, seq_len, n_channels, n_times).

Returns:

Shape (batch_size, seq_len, output_size).

Return type:

torch.Tensor
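
A minimal sketch of the wrapper in use. The feat_extractor below is a hypothetical stand-in for a window-level model; any nn.Module that accepts input of shape (batch_size, n_channels, n_times) and returns a flat feature vector per window can be wrapped.

import torch
from torch import nn
from braindecode.models import TimeDistributed

# Hypothetical window-level feature extractor: maps each window of shape
# (batch_size, n_channels, n_times) to a flat 8-dimensional feature vector.
feat_extractor = nn.Sequential(
    nn.Conv1d(in_channels=2, out_channels=8, kernel_size=3),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(start_dim=1),
)

model = TimeDistributed(feat_extractor)

# Batch of 4 sequences, each made of 3 consecutive windows with
# 2 channels and 1000 time samples per window.
x = torch.randn(4, 3, 2, 1000)
out = model(x)
print(out.shape)  # expected: torch.Size([4, 3, 8]), i.e. (batch_size, seq_len, output_size)

The wrapped module sees the windows as one flattened batch of size batch_size * seq_len, and the per-window outputs are then regrouped into the documented (batch_size, seq_len, output_size) shape.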

Examples using braindecode.models.TimeDistributed

Sleep staging on the Sleep Physionet dataset using Chambon2018 network

Sleep staging on the Sleep Physionet dataset using Eldele2021