braindecode.models.EEGITNet#
- class braindecode.models.EEGITNet(n_outputs=None, n_chans=None, n_times=None, drop_prob=0.4, chs_info=None, input_window_seconds=None, sfreq=None, n_classes=None, in_channels=None, input_window_samples=None, add_log_softmax=True)[source]#
- EEG-ITNet: An Explainable Inception Temporal Convolutional Network for motor imagery classification, from Salami et al. (2022).
See [Salami2022] for details.
Code adapted from abbassalami/eeg-itnet
- Parameters:
n_outputs (int) – Number of outputs of the model. This is the number of classes in the case of classification.
n_chans (int) – Number of EEG channels.
n_times (int) – Number of time samples of the input window.
drop_prob (float) – Dropout probability.
chs_info (list of dict) – Information about each individual EEG channel. This should be filled with info["chs"]. Refer to mne.Info for more details.
input_window_seconds (float) – Length of the input window in seconds.
sfreq (float) – Sampling frequency of the EEG recordings.
n_classes (int) – Alias for n_outputs.
in_channels (int) – Alias for n_chans.
input_window_samples (int) – Alias for n_times.
add_log_softmax (bool) – Whether to apply a log-softmax non-linearity as the output function. The final LogSoftmax layer will be removed in a future release; adjust your loss function accordingly (e.g. use CrossEntropyLoss on raw outputs). See the documentation of the torch.nn loss functions: https://pytorch.org/docs/stable/nn.html#loss-functions.
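A minimal usage sketch (the channel count, window length, and class count below are illustrative assumptions, not requirements of the model):

```python
import torch

from braindecode.models import EEGITNet

# Hypothetical setup: 4-class motor imagery, 22 EEG channels,
# 2-second windows at 250 Hz -> n_times = 500 samples.
model = EEGITNet(
    n_outputs=4,
    n_chans=22,
    n_times=500,
    drop_prob=0.4,
    add_log_softmax=False,  # emit raw logits; see the loss note above
)

# Dummy forward pass on a batch of 8 windows of shape (n_chans, n_times).
x = torch.randn(8, 22, 500)
out = model(x)
print(out.shape)  # torch.Size([8, 4])
```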
- Raises:
ValueError – If some input signal-related parameters are not specified and cannot be inferred.
FutureWarning – If add_log_softmax is True, since the LogSoftmax final layer will be removed in the future.
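As a sketch of the loss pairing implied by the warning above (the shapes and class labels here are illustrative assumptions):

```python
import torch
from torch import nn

from braindecode.models import EEGITNet

# With add_log_softmax=False the model emits raw logits, which is what
# nn.CrossEntropyLoss expects. With add_log_softmax=True (deprecated),
# the model emits log-probabilities, so the matching loss is nn.NLLLoss.
model = EEGITNet(n_outputs=4, n_chans=22, n_times=500, add_log_softmax=False)
criterion = nn.CrossEntropyLoss()

x = torch.randn(8, 22, 500)    # dummy batch of EEG windows
y = torch.randint(0, 4, (8,))  # dummy integer class labels
loss = criterion(model(x), y)
```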
Notes
This implementation is not guaranteed to be correct and has not been checked by the original authors; it was reimplemented from the paper, based on the authors' published code.
References
[Salami2022] A. Salami, J. Andreu-Perez and H. Gillmeister, "EEG-ITNet: An Explainable Inception Temporal Convolutional Network for motor imagery classification," in IEEE Access, 2022, doi: 10.1109/ACCESS.2022.3161489.