Cross-session motor imagery with the deep learning EEGNet v4 model#

This example shows how to use Braindecode in combination with a MOABB evaluation. We use the EEGNet v4 architecture.

# Authors: Igor Carrara <igor.carrara@inria.fr>
#          Bruno Aristimunha <b.aristimunha@gmail.com>
#
# License: BSD (3-clause)

import matplotlib.pyplot as plt
import mne
import seaborn as sns
import torch
from moabb.datasets import BNCI2014_001
from moabb.evaluations import CrossSessionEvaluation
from moabb.paradigms import MotorImagery
from moabb.utils import setup_seed
from sklearn.pipeline import make_pipeline
from skorch.callbacks import EarlyStopping, EpochScoring
from skorch.dataset import ValidSplit

from braindecode import EEGClassifier
from braindecode.models import EEGNet

mne.set_log_level(False)

# Print PyTorch information
print(f"Torch Version: {torch.__version__}")

# Use the GPU if it is available
cuda = torch.cuda.is_available()
device = "cuda" if cuda else "cpu"
print("GPU is", "AVAILABLE" if cuda else "NOT AVAILABLE")
Torch Version: 2.8.0+cu128
GPU is NOT AVAILABLE

In this example, we will use only the dataset BNCI2014_001.

Running the benchmark#

This example uses the CrossSession evaluation procedure. We focus on the dataset BNCI2014_001 and restrict it to its first two subjects to reduce computational time.

To keep the computational time low, the number of training epochs is reduced. In a real situation, we suggest using the following: EPOCH = 1000 and PATIENCE = 300.

This code is implemented to run on the CPU. If you are using a GPU, do not use multithreading (i.e. set n_jobs=1).
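
For reference, here is a minimal sketch of the full-scale settings suggested above; the variable names FULL_EPOCH and FULL_PATIENCE are illustrative and are not used by the quick demo below.

# Full-scale settings suggested above (illustrative, not used in this demo)
FULL_EPOCH = 1000     # maximum number of training epochs
FULL_PATIENCE = 300   # early-stopping patience (epochs without validation improvement)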

# Set random seed to be able to reproduce results
seed = 42
setup_seed(seed)

# Ensure that all operations are deterministic on GPU (if used) for reproducibility
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False

# Hyperparameters
LEARNING_RATE = 0.0625 * 0.01  # parameter taken from Braindecode
WEIGHT_DECAY = 0  # parameter taken from Braindecode
BATCH_SIZE = 64  # parameter taken from Braindecode
EPOCH = 10
PATIENCE = 3
fmin = 4
fmax = 100
tmin = 0
tmax = None

# Load the dataset
dataset = BNCI2014_001()
events = ["right_hand", "left_hand"]
paradigm = MotorImagery(
    events=events, n_classes=len(events), fmin=fmin, fmax=fmax, tmin=tmin, tmax=tmax
)
subjects = [1]
X, _, _ = paradigm.get_data(dataset=dataset, subjects=subjects)
We try to set the tensorflow seeds, but it seems that tensorflow is not installed. Please refer to `https://www.tensorflow.org/` to install if you need to use this deep learning module.
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/datasets/preprocessing.py:278: UserWarning: warnEpochs <Epochs | 24 events (all good), 2 – 6 s (baseline off), ~4.1 MiB, data loaded,
 'right_hand': 12
 'left_hand': 12>
  warn(f"warnEpochs {epochs}")

Create Pipelines#

To create a pipeline, we first need to load a model from Braindecode. The second step is to define a skorch model using EEGClassifier from Braindecode, which wraps the PyTorch model as a scikit-learn classifier. Here, we use the EEGNet v4 model [1]. This model has mandatory hyperparameters (the number of channels, the number of classes, and the temporal length of the input), but we do not need to specify them: they are set dynamically by EEGClassifier from the input data during the call to the .fit() method.

# Define a Skorch classifier
clf = EEGClassifier(
    module=EEGNet,
    optimizer=torch.optim.Adam,
    optimizer__lr=LEARNING_RATE,
    batch_size=BATCH_SIZE,
    max_epochs=EPOCH,
    train_split=ValidSplit(0.2, random_state=seed),
    device=device,
    callbacks=[
        EarlyStopping(monitor="valid_loss", patience=PATIENCE),
        EpochScoring(
            scoring="accuracy", on_train=True, name="train_acc", lower_is_better=False
        ),
        EpochScoring(
            scoring="accuracy", on_train=False, name="valid_acc", lower_is_better=False
        ),
    ],
    verbose=1,  # Print the results for each epoch
)

# Create the pipelines
pipes = {}
pipes["EEGNet"] = make_pipeline(clf)

Evaluation#

dataset.subject_list = dataset.subject_list[:2]

evaluation = CrossSessionEvaluation(
    paradigm=paradigm,
    datasets=dataset,
    suffix="braindecode_example",
    overwrite=True,
    return_epochs=True,
    n_jobs=1,
)

results = evaluation.process(pipes)

print(results.head())
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/analysis/results.py:93: RuntimeWarning: Setting non-standard config type: "MOABB_RESULTS"
  set_config("MOABB_RESULTS", osp.join(osp.expanduser("~"), "mne_data"))

BNCI2014-001-CrossSession:   0%|          | 0/2 [00:00<?, ?it/s]/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/datasets/preprocessing.py:278: UserWarning: warnEpochs <Epochs | 24 events (all good), 2 – 6 s (baseline off), ~4.1 MiB, data loaded,
 'right_hand': 12
 'left_hand': 12>
  warn(f"warnEpochs {epochs}")
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/paradigms/base.py:350: RuntimeWarning: Concatenation of Annotations within Epochs is not supported yet. All annotations will be dropped.
  X = mne.concatenate_epochs(X)
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5000        0.7130       0.5862        0.6917  0.1904
      2       0.5625        0.6884       0.5517        0.6916  0.1724
      3       0.4219        0.7293       0.5172        0.6915  0.1713
      4       0.4219        0.7269       0.5517        0.6915  0.1699
      5       0.5000        0.7142       0.5517        0.6913  0.1710
      6       0.5156        0.7064       0.5517        0.6913  0.1698
      7       0.5469        0.6922       0.5172        0.6912  0.1716
      8       0.5625        0.6918       0.5172        0.6912  0.1741
      9       0.5156        0.6868       0.5172        0.6911  0.1720
     10       0.5312        0.6969       0.5172        0.6910  0.1712
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4375        0.7023       0.5172        0.6926  0.1729
      2       0.4375        0.7214       0.5172        0.6926  0.1707
      3       0.5000        0.6940       0.5172        0.6926  0.1734
Stopping since valid_loss has not improved in the last 3 epochs.

BNCI2014-001-CrossSession:  50%|█████     | 1/2 [00:05<00:05,  5.44s/it]/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/datasets/preprocessing.py:278: UserWarning: warnEpochs <Epochs | 24 events (all good), 2 – 6 s (baseline off), ~4.1 MiB, data loaded,
 'right_hand': 12
 'left_hand': 12>
  warn(f"warnEpochs {epochs}")
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/paradigms/base.py:350: RuntimeWarning: Concatenation of Annotations within Epochs is not supported yet. All annotations will be dropped.
  X = mne.concatenate_epochs(X)
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4531        0.6903       0.5172        0.6928  0.1744
      2       0.5469        0.6817       0.5172        0.6915  0.1713
      3       0.4531        0.6995       0.5172        0.6903  0.1709
      4       0.5781        0.6782       0.5172        0.6890  0.1699
      5       0.6094        0.6852       0.5172        0.6879  0.1731
      6       0.5312        0.6821       0.5172        0.6872  0.1724
      7       0.5312        0.6955       0.5172        0.6865  0.1709
      8       0.6094        0.6916       0.5172        0.6861  0.1706
      9       0.5781        0.6712       0.5172        0.6858  0.1714
     10       0.5625        0.6674       0.5172        0.6855  0.1700
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4688        0.7180       0.4483        0.7072  0.1727
      2       0.4531        0.6965       0.4483        0.7062  0.1709
      3       0.5625        0.6901       0.4483        0.7054  0.1698
      4       0.4844        0.7067       0.4828        0.7049  0.1712
      5       0.6875        0.6435       0.4828        0.7044  0.1699
      6       0.5781        0.6685       0.4828        0.7041  0.1706
      7       0.5625        0.6800       0.4828        0.7039  0.1702
      8       0.5625        0.6592       0.4828        0.7037  0.1686
      9       0.5156        0.6745       0.4828        0.7036  0.1692
     10       0.5781        0.6549       0.4828        0.7037  0.1732

BNCI2014-001-CrossSession: 100%|██████████| 2/2 [00:11<00:00,  5.99s/it]
BNCI2014-001-CrossSession: 100%|██████████| 2/2 [00:11<00:00,  5.91s/it]
      score      time  samples  ... n_sessions       dataset  pipeline
0  0.509259  1.929963    144.0  ...          2  BNCI2014-001    EEGNet
1  0.527585  0.853690    144.0  ...          2  BNCI2014-001    EEGNet
2  0.554205  1.899107    144.0  ...          2  BNCI2014-001    EEGNet
3  0.546296  1.887717    144.0  ...          2  BNCI2014-001    EEGNet

[4 rows x 9 columns]
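
Before plotting, the cross-session scores can be summarized per subject and session (a minimal sketch assuming the standard MOABB results columns, which also include subject and session):

# Mean score per subject and per session
print(results.groupby(["subject", "session"])["score"].mean())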

Plot Results#

plt.figure()
sns.barplot(data=results, y="score", x="subject", palette="viridis")
plt.show()
[Figure: plot moabb benchmark — bar plot of score per subject]
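
If you prefer to look at the sessions separately, a per-session view can be drawn as well (a minimal sketch; the session column follows the MOABB results dataframe):

# Alternative view: one bar per session, grouped by subject
sns.catplot(data=results, x="subject", y="score", hue="session", kind="bar", palette="viridis")
plt.show()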

References#

[1] Lawhern, V. J., Solon, A. J., Waytowich, N. R., Gordon, S. M., Hung, C. P., & Lance, B. J. (2018). EEGNet: a compact convolutional neural network for EEG-based brain–computer interfaces. Journal of Neural Engineering, 15(5), 056013.

Total running time of the script: (0 minutes 15.709 seconds)

Estimated memory usage: 1283 MB
