Braindecode — Decode raw EEG, ECoG and MEG with deep learning

EEG · ECoG · MEG · EMG · iEEG  ·  Open source since 2017

Decode raw brain signals with deep learning.

A PyTorch-native toolbox for end-to-end neural decoding. Load a dataset, pick a published model, train. Same scikit-learn API you already know.

For neuroscientists who want to work with deep learning, and deep learning researchers who want to work with neurophysiological data.

v1.5 · BSD-3-Clause · Python 3.11+ · Works with MNE-Python

First release 2017 · PyTorch + scikit-learn + MNE

Built & maintained at UC San Diego, Uni Freiburg, Inria and the Donders Institute, with 80+ community contributors.
  • 65+ published model architectures
  • 20+ EEG-specific augmentations
  • 150+ public datasets via MOABB
  • 700+ BIDS datasets via EEGDash
Quickstart

Train a model in 15 lines.

From raw motor-imagery EEG to a trained classifier. Each step is a stand-alone module; swap any piece without touching the others.

  • 01 Load the dataset Pull BCI Competition IV-2a via MOABB.
  • 02 Preprocess & window Bandpass-filter, then cut motor-imagery trials.
  • 03 Pick a model EEGNeX with 22 channels, 4 classes, 4.5 s @ 250 Hz.
  • 04 Train skorch wrapper with sklearn-style fit().
train_eegnex.py (PyTorch · CUDA)
# 1. Load BCI Competition IV-2a (subject 3) via MOABB
from braindecode.datasets import MOABBDataset
dataset = MOABBDataset("BNCI2014_001", subject_ids=[3])
# 2. Bandpass-filter and create event-aligned windows
from braindecode.preprocessing import (
    Preprocessor, preprocess, create_windows_from_events,
)
preprocess(dataset, [Preprocessor("filter", l_freq=4., h_freq=38.)])
windows = create_windows_from_events(dataset)
# 3. Instantiate a published architecture
from braindecode.models import EEGNeX
model = EEGNeX(n_chans=22, n_outputs=4, n_times=1125)
# 4. Train with the skorch-based EEGClassifier
from braindecode import EEGClassifier
clf = EEGClassifier(module=model, max_epochs=10)
clf.fit(windows, y=None)
Trains on CPU or GPU · Swap the dataset, model and trainer freely; same API. Open the full tutorial →
Model zoo

A library of published architectures.

Every model is reproduced from its original paper and ships with a consistent constructor. The shared EEGModuleMixin accepts any combination of n_chans, n_outputs, n_times, chs_info, input_window_seconds and sfreq, and derives missing shape parameters from the ones you provide where possible (e.g. n_times from input_window_seconds × sfreq).
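The derivation idea can be sketched in plain Python. This is a simplified illustration of the pattern, not Braindecode's actual implementation; the helper name `derive_n_times` is hypothetical:

```python
# Simplified sketch: derive one shape parameter from the others,
# mirroring the idea behind EEGModuleMixin (illustrative only).
def derive_n_times(n_times=None, input_window_seconds=None, sfreq=None):
    """Return n_times, deriving it as duration x sampling rate if absent."""
    if n_times is not None:
        return n_times
    if input_window_seconds is not None and sfreq is not None:
        return int(input_window_seconds * sfreq)
    raise ValueError("Need n_times, or input_window_seconds and sfreq.")

# 4.5 s at 250 Hz -> 1125 samples, matching n_times in the quickstart
print(derive_n_times(input_window_seconds=4.5, sfreq=250))
```

Passing `input_window_seconds=4.5, sfreq=250.` to a model constructor is therefore equivalent to passing `n_times=1125` directly.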

braindecode + Hugging Face

Hugging Face Hub

Pretrained, on the Hub.

Every Braindecode model implements from_pretrained and push_to_hub, the same unified API as transformers. Models serialize as a config-driven JSON + safetensors pair (diffable, audit-friendly), and datasets ride on Zarr so push / pull stream chunks lazily without re-uploading the full archive.
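The config-driven half of that format can be pictured as a small JSON file written next to the safetensors weights. A minimal sketch with the standard library; the field names here are hypothetical, not Braindecode's actual schema:

```python
import json

# Hypothetical config mirroring the model's constructor arguments;
# the real schema is defined by Braindecode, these keys are illustrative.
config = {"model_class": "EEGNeX", "n_chans": 22, "n_outputs": 4, "n_times": 1125}

# Serialized with stable key order, the file stays human-readable
# and produces clean diffs when hyperparameters change.
text = json.dumps(config, indent=2, sort_keys=True)
print(text)

# Round-trips losslessly, so a checkpoint can be rebuilt from config + weights.
assert json.loads(text) == config
```

Keeping weights in safetensors and hyperparameters in JSON is what makes a checkpoint diffable and its provenance auditable.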

Models

Curated foundation weights

BENDR, EEGPT, Signal-JEPA, LaBraM, BIOT and more. Pretrained checkpoints stored as safetensors alongside a readable JSON config, so fine-tuning is reproducible and weight provenance stays auditable. Curated and benchmarked under OpenEEG-Bench.

model = EEGPT.from_pretrained("braindecode/eegpt")

Visit huggingface.co/braindecode →
Datasets

Share & pull EEG datasets

Push WindowsDataset, EEGWindowsDataset or RawDataset objects to the Hub with one call. Storage is Zarr-backed, so subsequent pulls only fetch the chunks you read. Useful for multi-GB sessions over flaky links. EEGDash mirrors 700+ BIDS-ready EEG/MEG datasets the same way.

ds.push_to_hub("alice/bnci2014_001")

Browse EEGDash datasets →

Ready to brain decode something?

Install in one line, follow a tutorial in fifteen, publish an experiment by Friday.
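The one-line install is the standard pip command (assuming a Python 3.11+ environment, per the requirements above):

```shell
pip install braindecode
```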