Note
Go to the end to download the full example code.
Cross-session motor imagery with deep learning EEGNet v4 model#
This example shows how to use Braindecode in combination with MOABB evaluation. Here, we use the EEGNetv4 architecture.
# Authors: Igor Carrara <igor.carrara@inria.fr>
# Bruno Aristimunha <b.aristimunha@gmail.com>
#
# License: BSD (3-clause)
import matplotlib.pyplot as plt
import mne
import seaborn as sns
import torch
from moabb.datasets import BNCI2014_001
from moabb.evaluations import CrossSessionEvaluation
from moabb.paradigms import MotorImagery
from moabb.utils import setup_seed
from sklearn.pipeline import make_pipeline
from skorch.callbacks import EarlyStopping, EpochScoring
from skorch.dataset import ValidSplit
from braindecode import EEGClassifier
from braindecode.models import EEGNetv4
mne.set_log_level(False)
# Print the PyTorch version
print(f"Torch Version: {torch.__version__}")
# Set up GPU if it is there
cuda = torch.cuda.is_available()
device = "cuda" if cuda else "cpu"
print("GPU is", "AVAILABLE" if cuda else "NOT AVAILABLE")
Torch Version: 2.8.0+cu128
GPU is NOT AVAILABLE
In this example, we will use only the dataset BNCI2014_001.
Running the benchmark#
This example uses the CrossSession evaluation procedure. We focus on the dataset BNCI2014_001 and on a single subject to reduce the computational time.
To keep the computational time low, the number of training epochs is reduced. In a real situation, we suggest using the settings sketched below: EPOCH = 1000, PATIENCE = 300.
This code is implemented to run on the CPU. If you are using a GPU, do not use multithreading (i.e. set n_jobs=1).
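For reference, the full-length configuration suggested above would read as follows. The values are overridden by the reduced ones defined just below, so this block is illustrative only:
# Suggested settings for a real run; this example overrides them just below
# with EPOCH = 10 and PATIENCE = 3 to keep the runtime short.
EPOCH = 1000    # maximum number of training epochs
PATIENCE = 300  # early-stopping patience, in epochs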
# Set random seed to be able to reproduce results
seed = 42
setup_seed(seed)
# Ensure that all operations are deterministic on GPU (if used) for reproducibility
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
# Hyperparameter
LEARNING_RATE = 0.0625 * 0.01  # parameter taken from Braindecode
WEIGHT_DECAY = 0  # parameter taken from Braindecode
BATCH_SIZE = 64  # parameter taken from Braindecode
EPOCH = 10
PATIENCE = 3
fmin = 4
fmax = 100
tmin = 0
tmax = None
# Load the dataset
dataset = BNCI2014_001()
events = ["right_hand", "left_hand"]
paradigm = MotorImagery(
    events=events, n_classes=len(events), fmin=fmin, fmax=fmax, tmin=tmin, tmax=tmax
)
subjects = [1]
X, _, _ = paradigm.get_data(dataset=dataset, subjects=subjects)
We try to set the tensorflow seeds, but it seems that tensorflow is not installed. Please refer to `https://www.tensorflow.org/` to install if you need to use this deep learning module.
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/datasets/preprocessing.py:278: UserWarning: warnEpochs <Epochs | 24 events (all good), 2 – 6 s (baseline off), ~4.1 MiB, data loaded,
'right_hand': 12
'left_hand': 12>
warn(f"warnEpochs {epochs}")
Create Pipelines#
In order to create a pipeline, we need to load a model from braindecode.
the second step is to define a skorch model using EEGClassifier from braindecode
that allows converting the PyTorch model in a scikit-learn classifier.
Here, we will use the EEGNet v4 model [1] .
This model has mandatory hyperparameters (the number of channels, the number of classes,
and the temporal length of the input) but we do not need to specify them because they will
be set dynamically by EEGClassifier using the input data during the call to the .fit()
method.
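For illustration only, here is a minimal sketch of what setting these mandatory hyperparameters by hand could look like, using the array returned by paradigm.get_data() above. The keyword names (n_chans, n_outputs, n_times) follow recent Braindecode releases and may differ in older versions (in_chans, n_classes, input_window_samples); this block is not required, since EEGClassifier infers the values automatically at fit time.
# Illustrative only: instantiate EEGNetv4 with explicit shapes taken from X,
# which has shape (n_epochs, n_channels, n_times) as returned by paradigm.get_data.
n_channels, n_times = X.shape[1], X.shape[2]
model = EEGNetv4(n_chans=n_channels, n_outputs=len(events), n_times=n_times)
print(model)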
# Define a Skorch classifier
clf = EEGClassifier(
    module=EEGNetv4,
    optimizer=torch.optim.Adam,
    optimizer__lr=LEARNING_RATE,
    batch_size=BATCH_SIZE,
    max_epochs=EPOCH,
    train_split=ValidSplit(0.2, random_state=seed),
    device=device,
    callbacks=[
        EarlyStopping(monitor="valid_loss", patience=PATIENCE),
        EpochScoring(
            scoring="accuracy", on_train=True, name="train_acc", lower_is_better=False
        ),
        EpochScoring(
            scoring="accuracy", on_train=False, name="valid_acc", lower_is_better=False
        ),
    ],
    verbose=1,  # Print the training log for each epoch (set to 0 to silence it)
)
# Create the pipelines
pipes = {}
pipes["EEGNetV4"] = make_pipeline(clf)
Evaluation#
dataset.subject_list = dataset.subject_list[:2]
evaluation = CrossSessionEvaluation(
    paradigm=paradigm,
    datasets=dataset,
    suffix="braindecode_example",
    overwrite=True,
    return_epochs=True,
    n_jobs=1,
)
results = evaluation.process(pipes)
print(results.head())
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/analysis/results.py:93: RuntimeWarning: Setting non-standard config type: "MOABB_RESULTS"
set_config("MOABB_RESULTS", osp.join(osp.expanduser("~"), "mne_data"))
BNCI2014-001-CrossSession:   0%|          | 0/2 [00:00<?, ?it/s]
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/datasets/preprocessing.py:278: UserWarning: warnEpochs <Epochs | 24 events (all good), 2 – 6 s (baseline off), ~4.1 MiB, data loaded,
'right_hand': 12
'left_hand': 12>
warn(f"warnEpochs {epochs}")
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/paradigms/base.py:350: RuntimeWarning: Concatenation of Annotations within Epochs is not supported yet. All annotations will be dropped.
X = mne.concatenate_epochs(X)
epoch train_acc train_loss valid_acc valid_loss dur
------- ----------- ------------ ----------- ------------ ------
1 0.5000 0.7130 0.5862 0.6917 0.1813
2 0.5625 0.6884 0.5517 0.6916 0.1572
3 0.4219 0.7293 0.5172 0.6915 0.1572
4 0.4219 0.7269 0.5517 0.6915 0.1582
5 0.5000 0.7142 0.5517 0.6913 0.1567
6 0.5156 0.7064 0.5517 0.6913 0.1563
7 0.5469 0.6922 0.5172 0.6912 0.1556
8 0.5625 0.6918 0.5172 0.6912 0.1565
9 0.5156 0.6868 0.5172 0.6911 0.1582
10 0.5312 0.6969 0.5172 0.6910 0.1572
epoch train_acc train_loss valid_acc valid_loss dur
------- ----------- ------------ ----------- ------------ ------
1 0.4375 0.7023 0.5172 0.6926 0.1558
2 0.4375 0.7214 0.5172 0.6926 0.1562
3 0.5000 0.6940 0.5172 0.6926 0.1546
Stopping since valid_loss has not improved in the last 3 epochs.
BNCI2014-001-CrossSession:  50%|█████     | 1/2 [00:05<00:05,  5.31s/it]
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/urllib3/connectionpool.py:1064: InsecureRequestWarning: Unverified HTTPS request is being made to host 'lampx.tugraz.at'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
warnings.warn(
100%|██████████████████████████████████████| 43.1M/43.1M [00:00<00:00, 210GB/s]
100%|██████████████████████████████████████| 44.2M/44.2M [00:00<00:00, 191GB/s]
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/datasets/preprocessing.py:278: UserWarning: warnEpochs <Epochs | 24 events (all good), 2 – 6 s (baseline off), ~4.1 MiB, data loaded,
'right_hand': 12
'left_hand': 12>
warn(f"warnEpochs {epochs}")
/opt/hostedtoolcache/Python/3.12.11/x64/lib/python3.12/site-packages/moabb/paradigms/base.py:350: RuntimeWarning: Concatenation of Annotations within Epochs is not supported yet. All annotations will be dropped.
X = mne.concatenate_epochs(X)
epoch train_acc train_loss valid_acc valid_loss dur
------- ----------- ------------ ----------- ------------ ------
1 0.4531 0.6903 0.5172 0.6928 0.1551
2 0.5469 0.6817 0.5172 0.6915 0.1557
3 0.4531 0.6995 0.5172 0.6903 0.1563
4 0.5781 0.6782 0.5172 0.6890 0.1604
5 0.6094 0.6852 0.5172 0.6879 0.1546
6 0.5312 0.6821 0.5172 0.6872 0.1547
7 0.5312 0.6955 0.5172 0.6865 0.1553
8 0.6094 0.6916 0.5172 0.6861 0.1585
9 0.5781 0.6712 0.5172 0.6858 0.1785
10 0.5625 0.6674 0.5172 0.6855 0.1612
epoch train_acc train_loss valid_acc valid_loss dur
------- ----------- ------------ ----------- ------------ ------
1 0.4688 0.7180 0.4483 0.7072 0.1538
2 0.4531 0.6965 0.4483 0.7062 0.1569
3 0.5625 0.6901 0.4483 0.7054 0.1555
4 0.4844 0.7067 0.4828 0.7049 0.1593
5 0.6875 0.6435 0.4828 0.7044 0.1566
6 0.5781 0.6685 0.4828 0.7041 0.1566
7 0.5625 0.6800 0.4828 0.7039 0.1536
8 0.5625 0.6592 0.4828 0.7037 0.1571
9 0.5156 0.6745 0.4828 0.7036 0.1573
10 0.5781 0.6549 0.4828 0.7037 0.1579
BNCI2014-001-CrossSession: 100%|██████████| 2/2 [00:18<00:00, 10.04s/it]
BNCI2014-001-CrossSession: 100%|██████████| 2/2 [00:18<00:00, 9.33s/it]
score time samples ... n_sessions dataset pipeline
0 0.509259 1.768249 144.0 ... 2 BNCI2014-001 EEGNetV4
1 0.527585 0.762250 144.0 ... 2 BNCI2014-001 EEGNetV4
2 0.554205 1.750530 144.0 ... 2 BNCI2014-001 EEGNetV4
3 0.546296 1.721474 144.0 ... 2 BNCI2014-001 EEGNetV4
[4 rows x 9 columns]
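The results object is a plain pandas DataFrame, so it can also be summarized before plotting. A short illustrative sketch, assuming MOABB's standard result columns (subject, session, score):
# Mean score per subject and session (one row per subject, one column per session)
summary = results.groupby(["subject", "session"])["score"].mean().unstack()
print(summary)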
Plot Results#
plt.figure()
sns.barplot(data=results, y="score", x="subject", palette="viridis")
plt.show()

References#
[1] Lawhern, V. J., Solon, A. J., Waytowich, N. R., Gordon, S. M., Hung, C. P., & Lance, B. J. (2018). EEGNet: a compact convolutional neural network for EEG-based brain-computer interfaces. Journal of Neural Engineering, 15(5), 056013.
Total running time of the script: (0 minutes 23.033 seconds)
Estimated memory usage: 908 MB