braindecode.models.TimeDistributed

class braindecode.models.TimeDistributed(module)

Apply module on multiple windows.

Apply the provided module to each window of a sequence and return the concatenated outputs. Useful with sequence-to-prediction models (e.g., a sleep stager that must map a sequence of consecutive windows to the label of the middle window of the sequence).

Parameters
module : nn.Module

Module to be applied to the input windows. Must accept an input of shape (batch_size, n_channels, n_times).
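
The wrapped module only needs to handle single windows; TimeDistributed handles the sequence dimension. A minimal sketch of such a module follows (the channel count, window length and embedding size are illustrative assumptions, not values required by braindecode):

import torch
from torch import nn

# Hypothetical per-window feature extractor: takes a single window of shape
# (batch_size, n_channels, n_times) and returns one feature vector per window.
feature_extractor = nn.Sequential(
    nn.Flatten(start_dim=1),    # -> (batch_size, n_channels * n_times)
    nn.Linear(2 * 100, 16),     # -> (batch_size, 16)
)

windows = torch.randn(8, 2, 100)           # (batch_size, n_channels, n_times)
print(feature_extractor(windows).shape)    # torch.Size([8, 16])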

Methods

forward(x)
Parameters
x : torch.Tensor

Sequence of windows, of shape (batch_size, seq_len, n_channels, n_times).

Returns
torch.Tensor

Shape (batch_size, seq_len, output_size), where output_size is the flattened size of the wrapped module's output for a single window.
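
A hedged end-to-end usage sketch (the feature extractor and all sizes are illustrative assumptions, not part of the braindecode API):

import torch
from torch import nn
from braindecode.models import TimeDistributed

# Hypothetical per-window feature extractor: 2 channels, 100 time points,
# 16-dimensional embedding per window.
feature_extractor = nn.Sequential(
    nn.Flatten(start_dim=1),
    nn.Linear(2 * 100, 16),
)
time_distributed = TimeDistributed(feature_extractor)

# Batch of 4 sequences, each made of 5 consecutive windows.
x = torch.randn(4, 5, 2, 100)    # (batch_size, seq_len, n_channels, n_times)
out = time_distributed(x)
print(out.shape)                 # torch.Size([4, 5, 16]) = (batch_size, seq_len, output_size)

In this sketch the sequence dimension is folded into the batch dimension before the wrapped module is applied, and the per-window outputs are reshaped back to (batch_size, seq_len, -1).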
