Braindecode API Reference#

Models#

Model zoo available in braindecode. The models are implemented as PyTorch torch.nn.Module subclasses and can be used for various EEG decoding tasks.

All models follow the same convention for naming their signal-related parameters, according to braindecode's standards:

  • n_outputs: Number of labels or outputs of the model.

  • n_chans: Number of EEG channels.

  • n_times: Number of time points of the input window.

  • sfreq: Sampling frequency of the EEG recordings.

  • input_window_seconds: Length of the input window in seconds.

  • chs_info: Information about each individual EEG channel. Refer to mne.Info["chs"].

All models assume the input is a 3D tensor of shape (batch_size, n_chans, n_times); some models also accept a 4D tensor of shape (batch_size, n_chans, n_times, n_epochs) in the case of cropped models, as illustrated below.
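For example, a model can be instantiated with these shared parameters and applied to a batch of windows. A minimal sketch, assuming ShallowFBCSPNet (listed below) and illustrative channel, class, and window sizes:

```python
import torch

from braindecode.models import ShallowFBCSPNet

# Illustrative values: 22 EEG channels, 4 classes, 1-second windows at 250 Hz.
model = ShallowFBCSPNet(n_chans=22, n_outputs=4, n_times=250)

x = torch.randn(8, 22, 250)  # (batch_size, n_chans, n_times)
with torch.no_grad():
    y = model(x)
print(y.shape)  # expected: torch.Size([8, 4]), i.e. (batch_size, n_outputs)
```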

All models are implemented as subclasses of EEGModuleMixin, the base mixin for EEG models in braindecode. EEGModuleMixin provides a common interface and derives missing signal-related parameters when possible (e.g. n_times from input_window_seconds and sfreq).

All models also inherit from PyTorchModelHubMixin, which provides functionality to save and load models from the Hugging Face Hub when the braindecode[hug] extra is installed.
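A sketch of the Hub integration, assuming the model was built as above; the directory and repository id are hypothetical, and pushing requires Hub credentials:

```python
from braindecode.models import ShallowFBCSPNet

model = ShallowFBCSPNet(n_chans=22, n_outputs=4, n_times=250)

# save_pretrained / from_pretrained / push_to_hub come from PyTorchModelHubMixin.
model.save_pretrained("my-shallow-fbcsp")          # local directory (hypothetical)
# model.push_to_hub("my-user/my-shallow-fbcsp")    # Hub repository id (hypothetical)

restored = ShallowFBCSPNet.from_pretrained("my-shallow-fbcsp")
```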

Note:

Auto-generated Pydantic configs are available when the optional braindecode[pydantic] extra (which installs pydantic and numpydantic) is installed; otherwise config generation is skipped.

braindecode.models.base:

EEGModuleMixin([n_outputs, n_chans, ...])

Mixin class for all EEG models in braindecode.

braindecode.models:

ATCNet([n_chans, n_outputs, ...])

ATCNet from Altaheri et al. (2022) [1].

AttentionBaseNet([n_times, n_chans, ...])

AttentionBaseNet from Wimpff M et al. (2023) [Martin2023].

AttnSleep([sfreq, n_tce, d_model, d_ff, ...])

Sleep Staging Architecture from Eldele et al. (2021) [Eldele2021].

BDTCN([n_chans, n_outputs, chs_info, ...])

Braindecode TCN from Gemein, L et al (2020) [gemein2020].

BIOT([embed_dim, num_heads, num_layers, ...])

BIOT from Yang et al. (2023) [Yang2023].

BENDR([n_chans, n_outputs, n_times, ...])

BENDR (BErt-inspired Neural Data Representations) from Kostas et al. (2021) [bendr].

ContraWR([n_chans, n_outputs, sfreq, ...])

Contrast with the World Representation ContraWR from Yang et al (2021) [Yang2021].

CTNet([n_outputs, n_chans, sfreq, chs_info, ...])

CTNet from Zhao, W et al (2024) [ctnet].

Deep4Net([n_chans, n_outputs, n_times, ...])

Deep ConvNet model from Schirrmeister et al (2017) [Schirrmeister2017].

DeepSleepNet([n_outputs, return_feats, ...])

DeepSleepNet from Supratak et al. (2017) [Supratak2017].

EEGConformer([n_outputs, n_chans, ...])

EEG Conformer from Song et al. (2022) [song2022].

EEGInceptionERP([n_chans, n_outputs, ...])

EEG Inception for ERP-based classification from Santamaria-Vazquez et al (2020) [santamaria2020].

EEGInceptionMI([n_chans, n_outputs, ...])

EEG Inception for Motor Imagery, as proposed in Zhang et al. (2021) [1].

EEGITNet([n_outputs, n_chans, n_times, ...])

EEG-ITNet from Salami et al (2022) [Salami2022].

EEGMiner([method, n_chans, n_outputs, ...])

EEGMiner from Ludwig et al (2024) [eegminer].

EEGNet([n_chans, n_outputs, n_times, ...])

EEGNet model from Lawhern et al. (2018) [Lawhern2018].

EEGNeX([n_chans, n_outputs, n_times, ...])

EEGNeX model from Chen et al. (2024) [eegnex].

EEGSimpleConv([n_outputs, n_chans, sfreq, ...])

EEGSimpleConv from Ouahidi, YE et al. (2023) [Yassine2023].

EEGSym([n_chans, n_outputs, n_times, ...])

EEGSym from Pérez-Velasco et al (2022) [eegsym2022].

EEGTCNet([n_chans, n_outputs, n_times, ...])

EEGTCNet model from Ingolfsson et al. (2020) [ingolfsson2020].

FBCNet([n_chans, n_outputs, chs_info, ...])

FBCNet from Mane, R et al (2021) [fbcnet2021].

FBLightConvNet([n_chans, n_outputs, ...])

LightConvNet from Ma, X et al (2023) [lightconvnet].

FBMSNet([n_chans, n_outputs, chs_info, ...])

FBMSNet from Liu et al (2022) [fbmsnet].

IFNet([n_chans, n_outputs, n_times, ...])

IFNetV2 from Wang J et al (2023) [ifnet].

Labram([n_times, n_outputs, chs_info, ...])

Labram from Jiang, W B et al (2024) [Jiang2024].

LUNA([n_outputs, n_chans, n_times, sfreq, ...])

LUNA from Döner et al. [LUNA].

MEDFormer([n_chans, n_outputs, n_times, ...])

Medformer from Wang et al. (2024) [Medformer2024].

MSVTNet([n_chans, n_outputs, n_times, ...])

MSVTNet model from Liu K et al (2024) [msvt2024].

PBT([n_chans, n_outputs, n_times, chs_info, ...])

Patched Brain Transformer (PBT) model from Klein et al. (2025) [pbt].

SCCNet([n_chans, n_outputs, n_times, ...])

SCCNet from Wei, C S (2019) [sccnet].

ShallowFBCSPNet([n_chans, n_outputs, ...])

Shallow ConvNet model from Schirrmeister et al (2017) [Schirrmeister2017].

SignalJEPA([n_outputs, n_chans, chs_info, ...])

Architecture introduced in signal-JEPA for self-supervised pre-training, Guetschel, P et al (2024) [1].

SignalJEPA_Contextual([n_outputs, n_chans, ...])

Contextual downstream architecture introduced in signal-JEPA, Guetschel, P et al (2024) [1].

SignalJEPA_PostLocal([n_outputs, n_chans, ...])

Post-local downstream architecture introduced in signal-JEPA, Guetschel, P et al (2024) [1].

SignalJEPA_PreLocal([n_outputs, n_chans, ...])

Pre-local downstream architecture introduced in signal-JEPA, Guetschel, P et al (2024) [1].

SincShallowNet([num_time_filters, ...])

Sinc-ShallowNet from Borra, D et al (2020) [borra2020].

SleepStagerBlanco2020([n_chans, sfreq, ...])

Sleep staging architecture from Blanco et al. (2020) [Blanco2020].

SleepStagerChambon2018([n_chans, sfreq, ...])

Sleep staging architecture from Chambon et al. (2018) [Chambon2018].

SSTDPN([n_chans, n_times, n_outputs, ...])

SSTDPN from Can Han et al (2025) [Han2025].

SPARCNet([n_chans, n_times, n_outputs, ...])

Seizures, Periodic and Rhythmic pattern Continuum Neural Network (SPaRCNet) from Jing et al. (2023) [jing2023].

SyncNet([n_chans, n_times, n_outputs, ...])

Synchronization Network (SyncNet) from Li, Y et al (2017) [Li2017].

TIDNet([n_chans, n_outputs, n_times, ...])

Thinker Invariance DenseNet model from Kostas et al. (2020) [TIDNet].

TSception([n_chans, n_outputs, ...])

TSception model from Ding et al. (2020) [ding2020].

USleep([n_chans, sfreq, depth, ...])

Sleep staging architecture from Perslev et al. (2021) [1].

Modules#

braindecode.modules:

This module contains the building blocks for braindecode models: activation functions, convolutional layers, attention mechanisms, filter banks, and other utilities.

Activation#

These modules wrap specialized activation functions—e.g., safe logarithms for numerical stability.

braindecode.modules.activation:

LogActivation([epsilon])

Logarithm activation function.

SafeLog([epsilon])

Safe logarithm activation function module.

Attention#

These modules implement various attention mechanisms, including multi-head attention and squeeze-and-excitation layers.

braindecode.modules.attention:

CAT(in_channels, reduction_rate, kernel_size)

Attention Mechanism from [Wu2023].

CBAM(in_channels, reduction_rate, kernel_size)

Convolutional Block Attention Module from [Woo2018].

ECA(in_channels, kernel_size)

Efficient Channel Attention [Wang2021].

FCA(in_channels[, seq_len, reduction_rate, ...])

Frequency Channel Attention Networks from [Qin2021].

GCT(in_channels)

Gated Channel Transformation from [Yang2020].

SRM(in_channels[, use_mlp, reduction_rate, bias])

Attention module from [Lee2019].

CATLite(in_channels, reduction_rate[, bias])

Modification of CAT without the convolutional layer from [Wu2023].

EncNet(in_channels, n_codewords)

Context Encoding for Semantic Segmentation from [Zhang2018].

GatherExcite(in_channels[, seq_len, ...])

Gather-Excite Networks from [Hu2018b].

GSoP(in_channels, reduction_rate[, bias])

Global Second-order Pooling Convolutional Networks from [Gao2018].

MultiHeadAttention(emb_size, num_heads, dropout)

SqueezeAndExcitation(in_channels, reduction_rate)

Squeeze-and-Excitation Networks from [Hu2018].

Blocks#

These modules are specialized building blocks for neural networks, including multi-layer perceptrons (MLPs) and inception blocks.

braindecode.modules.blocks:

MLP(in_features[, hidden_features, ...])

Multilayer Perceptron (MLP) with GELU activation and optional dropout.

FeedForwardBlock(emb_size, expansion, drop_p)

InceptionBlock(branches)

Inception block module.

Convolution#

These modules implement specialized convolutional layers, including depthwise and causal convolutions, convolutions with norm constraints, and pooling layers.

braindecode.modules.convolution:

AvgPool2dWithConv(kernel_size, stride[, ...])

Compute average pooling using a convolution, so that a dilation parameter is available.

CausalConv1d(in_channels, out_channels, ...)

Causal 1-dimensional convolution

CombinedConv(in_chans[, n_filters_time, ...])

Merged convolutional layer for temporal and spatial convs in Deep4/ShallowFBCSP

Conv2dWithConstraint(*args[, max_norm])

DepthwiseConv2d(in_channels[, ...])

Depthwise convolution layer.
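A minimal sketch using CausalConv1d; the top-level import path and the assumption that causal (left) padding preserves the input length are inferred from the module layout and class name above:

```python
import torch

from braindecode.modules import CausalConv1d

conv = CausalConv1d(in_channels=22, out_channels=16, kernel_size=5)

x = torch.randn(8, 22, 250)   # (batch_size, channels, times)
out = conv(x)
print(out.shape)              # expected: (8, 16, 250) if causal padding keeps the length
```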

Filter#

These modules implement a filter bank as a layer and a generalized Gaussian filter layer.

braindecode.modules.filter:

FilterBankLayer(n_chans, sfreq[, ...])

Apply multiple band-pass filters to generate multiview signal representation.

GeneralizedGaussianFilter(in_channels, ...)

Generalized Gaussian Filter from Ludwig et al (2024) [eegminer].

Layers#

These modules implement various types of layers, including dropout, normalization, and time-distributed layers, as well as layers for handling different input shapes and dimensions.

braindecode.modules.layers:

Chomp1d(chomp_size)

DropPath([drop_prob])

Drop paths, also known as Stochastic Depth, per sample.

Ensure4d(*args, **kwargs)

TimeDistributed(module)

Apply module on multiple windows.
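A sketch of TimeDistributed wrapping a per-window feature extractor; the (batch, n_windows, n_chans, n_times) input layout and the flattened per-window output are assumptions based on its sequence-model usage:

```python
import torch
from torch import nn

from braindecode.modules import TimeDistributed

# Hypothetical per-window feature extractor: flattens each (2, 100) window to 8 features.
feature_extractor = nn.Sequential(nn.Flatten(start_dim=1), nn.Linear(2 * 100, 8))
time_distributed = TimeDistributed(feature_extractor)

x = torch.randn(4, 5, 2, 100)   # (batch, n_windows, n_chans, n_times) - assumed layout
out = time_distributed(x)
print(out.shape)                # expected: (4, 5, 8)
```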

Linear#

These modules implement linear layers with various constraints and initializations, such as max-norm constraints.

braindecode.modules.linear:

LinearWithConstraint(*args[, max_norm])

Linear layer with max-norm constraint on the weights.

MaxNormLinear(in_features, out_features[, ...])

Linear layer with MaxNorm constraining on weights.
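For example, LinearWithConstraint behaves like torch.nn.Linear with an additional weight max-norm constraint (a sketch; the max_norm value is illustrative):

```python
import torch

from braindecode.modules import LinearWithConstraint

layer = LinearWithConstraint(in_features=40, out_features=4, max_norm=0.5)

out = layer(torch.randn(8, 40))
print(out.shape)  # (8, 4)
```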

Stats#

These modules implement statistical layers for computing the mean, standard deviation, variance, log power, and log variance of input data. They are mostly used in filter-bank models.

braindecode.modules.stats:

StatLayer(stat_fn, dim[, keepdim, ...])

Generic layer to compute a statistical function along a specified dimension.

LogPowerLayer

Compute the log power along a specified dimension.

LogVarLayer

Compute the log variance along a specified dimension.

MaxLayer

Compute the maximum along a specified dimension.

MeanLayer

Compute the mean along a specified dimension.

StdLayer

Compute the standard deviation along a specified dimension.

VarLayer

Compute the variance along a specified dimension.
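A sketch of the generic StatLayer; the stat_fn and dim values are chosen for illustration, and the specialized layers above follow the same pattern:

```python
import torch

from braindecode.modules import StatLayer

# Variance over the time dimension of a (batch, n_chans, n_times) tensor.
var_layer = StatLayer(stat_fn=torch.var, dim=2)

x = torch.randn(8, 22, 250)
out = var_layer(x)
print(out.shape)  # (8, 22) if keepdim defaults to False, (8, 22, 1) otherwise
```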

Utilities#

These modules implement utility functions and classes, e.g. for aggregating predictions in cropped decoding.

braindecode.modules.util:

aggregate_probas(logits[, n_windows_stride])

Aggregate predicted probabilities with self-ensembling.

Wrappers#

These modules implement wrappers for various types of models, including wrappers for models with multiple outputs and wrappers for models with intermediate outputs.

braindecode.modules.wrapper:

Expression(expression_fn)

Compute given expression on forward pass.

IntermediateOutputWrapper(to_select, model)

Wraps network model such that outputs of intermediate layers can be returned.

Functional#

braindecode.functional:

The functional module contains stateless functions that can be used directly, as a functional API.

drop_path(x[, drop_prob, training, ...])

Drop paths (Stochastic Depth) per sample.

glorot_weight_zero_bias(model)

Initialize parameters of all modules by initializing weights with glorot uniform/xavier initialization, and setting biases to zero.

hilbert_freq(x[, forward_fourier])

Compute the Hilbert transform using PyTorch, separating the real and imaginary parts.

identity(x)

plv_time(x[, forward_fourier, epsilon])

Compute the Phase Locking Value (PLV) metric in the time domain.

rescale_parameter(param, layer_id)

Rescale the parameters of the l-th transformer layer.

safe_log(x[, eps])

Prevents log(0) by computing log(max(x, eps)).

square(x)
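For example, safe_log (listed above) can be called directly on tensors; a minimal sketch:

```python
import torch

from braindecode.functional import safe_log

x = torch.rand(8, 22, 250)
y = safe_log(x)  # computes log(max(x, eps)) to avoid log(0)
```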

Datasets#

braindecode.datasets:

PyTorch Dataset structures for common EEG datasets, and functions to create datasets from several data formats: NumPy arrays, MNE Raw, and MNE Epochs.

Base classes#

BaseConcatDataset(list_of_ds[, target_transform])

A base class for concatenated datasets.

RecordDataset([description, transform])

RawDataset(raw[, description, target_name, ...])

Returns samples from an mne.io.Raw object along with a target.

WindowsDataset(windows[, description, ...])

Returns windows from an mne.Epochs object along with a target.

BIDSDataset(root[, subjects, sessions, ...])

Dataset for loading BIDS.

BIDSEpochsDataset(*args, **kwargs)

Experimental dataset for loading mne.Epochs organised in BIDS.

Common Datasets#

BCICompetitionIVDataset4([subject_ids])

BCI competition IV dataset 4.

BNCI2014_001(subject_ids)

BNCI 2014-001 Motor Imagery dataset.

CHBMIT([root])

The Children's Hospital Boston EEG Dataset.

HGD(subject_ids)

High-gamma dataset described in Schirrmeister et al. 2017.

MOABBDataset(dataset_name[, subject_ids, ...])

A class for MOABB datasets.

NMT([path, target_name, recording_ids, ...])

The NMT Scalp EEG Dataset.

SleepPhysionet([subject_ids, recording_ids, ...])

Sleep Physionet dataset.

SIENA([root])

The Siena EEG Dataset.

SleepPhysionetChallenge2018([subject_ids, ...])

Physionet Challenge 2018 polysomnography dataset.

TUH(path[, recording_ids, target_name, ...])

Temple University Hospital (TUH) EEG Corpus (www.isip.piconepress.com/projects/tuh_eeg/html/downloads.shtml#c_tueg).

TUHAbnormal(path[, recording_ids, ...])

Temple University Hospital (TUH) Abnormal EEG Corpus.
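As an example, MOABB datasets can be fetched by name. A sketch, assuming the moabb package is installed; the dataset name spelling and subject choice are illustrative:

```python
from braindecode.datasets import MOABBDataset

# Downloads (via moabb) subject 3 of the BNCI 2014-001 motor imagery dataset.
dataset = MOABBDataset(dataset_name="BNCI2014_001", subject_ids=[3])

print(dataset.description)  # one row of metadata per recording
```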

Dataset Builders Functions#

Functions to create datasets from different data formats.

create_from_X_y(X, y, drop_last_window, sfreq)

Create a BaseConcatDataset of WindowsDatasets from X and y to be used for decoding with skorch and braindecode, where X is a list of pre-cut trials and y are corresponding targets.

create_from_mne_raw(raws, ...[, ...])

Create WindowsDatasets from mne.RawArrays

create_from_mne_epochs(list_of_epochs, ...)

Create WindowsDatasets from mne.Epochs
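A sketch of building a windows dataset from NumPy arrays with create_from_X_y; the keyword arguments beyond those shown in the signature above (ch_names, window sizes) follow typical usage and are assumptions here:

```python
import numpy as np

from braindecode.datasets import create_from_X_y

# 40 pre-cut trials of 22 channels x 500 samples, with integer class labels.
X = np.random.randn(40, 22, 500)
y = np.random.randint(0, 4, size=40)

windows_ds = create_from_X_y(
    X, y,
    drop_last_window=False,
    sfreq=250,
    ch_names=[f"ch{i}" for i in range(22)],   # assumed keyword
    window_size_samples=500,                  # assumed keyword
    window_stride_samples=500,                # assumed keyword
)
```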

Preprocessing#

braindecode.preprocessing:

Core Functions#

preprocess(concat_ds, preprocessors[, ...])

Apply preprocessors to a concat dataset.

Preprocessor(fn, *[, apply_on_array])

Preprocessor for an MNE Raw or Epochs object.

create_windows_from_events(concat_ds[, ...])

Create windows based on events in mne.Raw.

create_fixed_length_windows(concat_ds[, ...])

Windower that creates sliding windows.

create_windows_from_target_channels(concat_ds)

exponential_moving_demean(data[, ...])

Perform exponential moving demeaning.

exponential_moving_standardize(data[, ...])

Perform exponential moving standardization.

filterbank(raw, frequency_bands[, ...])

Applies multiple bandpass filters to the signals in raw.
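A sketch of how these core functions are typically chained on a concat dataset; the preprocessor choices, scaling factor, and window offsets are illustrative, and concat_ds stands for a previously loaded dataset (e.g. a MOABBDataset):

```python
from braindecode.preprocessing import (
    Preprocessor,
    create_windows_from_events,
    exponential_moving_standardize,
    preprocess,
)

preprocessors = [
    Preprocessor("pick_types", eeg=True, meg=False, stim=False),  # MNE method by name
    Preprocessor(lambda data: data * 1e6, apply_on_array=True),   # volts -> microvolts
    Preprocessor(exponential_moving_standardize,
                 factor_new=1e-3, init_block_size=1000),
]

# concat_ds: a BaseConcatDataset of raw recordings (e.g. a MOABBDataset).
preprocess(concat_ds, preprocessors)

windows_ds = create_windows_from_events(
    concat_ds,
    trial_start_offset_samples=0,
    trial_stop_offset_samples=0,
    preload=True,
)
```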

EEGPrep Pipeline#

EEGPrep(*[, resample_to, flatline_maxdur, ...])

Preprocessor for an MNE Raw object that applies the EEGPrep pipeline.

ReinterpolateRemovedChannels(*[, ...])

Reinterpolate previously removed EEG channels to restore original channel set.

RemoveBadChannels(*[, corr_threshold, ...])

Removes EEG channels with problematic data; variant that uses channel locations.

RemoveBadChannelsNoLocs(*[, min_corr, ...])

Remove EEG channels with problematic data; variant that does not use channel locations.

RemoveBadWindows(*[, max_bad_channels, ...])

Remove periods with abnormally high-power content from continuous data.

RemoveBursts(*[, cutoff, window_len, ...])

Run the Artifact Subspace Reconstruction (ASR) method on EEG data to remove burst-type artifacts.

RemoveCommonAverageReference(*[, ...])

Subtracts the common average reference from the EEG data (EEGPrep version).

RemoveDCOffset(*[, can_change_duration, ...])

Remove the DC offset from the EEG data by subtracting the per-channel median.

RemoveDrifts([transition, attenuation, method])

Remove drifts from the EEG data using a forward-backward high-pass filter.

RemoveFlatChannels(*[, ...])

Removes EEG channels that flat-line for extended periods of time.

Signal Processing#

Resample(sfreq, *[, npad, window, ...])

Braindecode preprocessor wrapper for resample().

Resampling(sfreq)

Resample the data to a specified rate (EEGPrep version).

Filter(l_freq, h_freq[, picks, ...])

Braindecode preprocessor wrapper for filter().

FilterData(sfreq, l_freq, h_freq[, picks, ...])

Braindecode preprocessor wrapper for filter_data().

NotchFilter(freqs[, picks, filter_length, ...])

Braindecode preprocessor wrapper for notch_filter().

SavgolFilter(h_freq[, verbose])

Braindecode preprocessor wrapper for savgol_filter().

ApplyHilbert([picks, envelope, n_jobs, ...])

Braindecode preprocessor wrapper for apply_hilbert().

Rescale(scalings, *[, verbose])

Braindecode preprocessor wrapper for rescale().

OversampledTemporalProjection([duration, ...])

Braindecode preprocessor wrapper for oversampled_temporal_projection().

Channel Management#

Pick(picks[, exclude, verbose])

Braindecode preprocessor wrapper for pick().

PickChannels(ch_names[, ordered, verbose])

Braindecode preprocessor wrapper for pick_channels().

PickTypes([meg, eeg, stim, eog, ecg, emg, ...])

Braindecode preprocessor wrapper for pick_types().

DropChannels(ch_names[, on_missing])

Braindecode preprocessor wrapper for drop_channels().

AddChannels(add_list[, force_update_info])

Braindecode preprocessor wrapper for add_channels().

CombineChannels(groups[, method, keep_stim, ...])

Braindecode preprocessor wrapper for combine_channels().

RenameChannels(mapping[, allow_duplicates, ...])

Braindecode preprocessor wrapper for rename_channels().

ReorderChannels(ch_names)

Braindecode preprocessor wrapper for reorder_channels().

SetChannelTypes(mapping, *[, ...])

Braindecode preprocessor wrapper for set_channel_types().

InterpolateBads([reset_bads, mode, origin, ...])

Braindecode preprocessor wrapper for interpolate_bads().

InterpolateTo(sensors[, origin, method, reg])

Braindecode preprocessor wrapper for interpolate_to().

InterpolateBridgedElectrodes(bridged_idx[, ...])

Braindecode preprocessor wrapper for interpolate_bridged_electrodes().

ComputeBridgedElectrodes([lm_cutoff, ...])

Braindecode preprocessor wrapper for compute_bridged_electrodes().

EqualizeChannels([copy, verbose])

Braindecode preprocessor wrapper for equalize_channels().

EqualizeBads([interp_thresh, copy])

Braindecode preprocessor wrapper for equalize_bads().

FindBadChannelsLof([n_neighbors, picks, ...])

Braindecode preprocessor wrapper for find_bad_channels_lof().

Reference & Montage#

SetEEGReference([ref_channels, copy, ...])

Braindecode preprocessor wrapper for set_eeg_reference().

SetBipolarReference(anode, cathode[, ...])

Braindecode preprocessor wrapper for set_bipolar_reference().

AddReferenceChannels(ref_channels[, copy])

Braindecode preprocessor wrapper for add_reference_channels().

SetMontage(montage[, match_case, ...])

Braindecode preprocessor wrapper for set_montage().

SSP Projections#

AddProj(projs[, remove_existing, verbose])

Braindecode preprocessor wrapper for add_proj().

ApplyProj([verbose])

Braindecode preprocessor wrapper for apply_proj().

DelProj([idx])

Braindecode preprocessor wrapper for del_proj().

Data Transformation#

Crop([tmin, tmax, include_tmax, verbose])

Braindecode preprocessor wrapper for crop().

CropByAnnotations([annotations, verbose])

Braindecode preprocessor wrapper for crop_by_annotations().

ComputeCurrentSourceDensity([sphere, ...])

Braindecode preprocessor wrapper for compute_current_source_density().

FixStimArtifact([events, event_id, tmin, ...])

Braindecode preprocessor wrapper for fix_stim_artifact().

MaxwellFilter([origin, int_order, ...])

Braindecode preprocessor wrapper for maxwell_filter().

RealignRaw(other, t_raw, t_other, *[, verbose])

Braindecode preprocessor wrapper for realign_raw().

RegressArtifact([picks, exclude, ...])

Braindecode preprocessor wrapper for regress_artifact().

Artifact Detection & Annotation#

AnnotateAmplitude([peak, flat, bad_percent, ...])

Braindecode preprocessor wrapper for annotate_amplitude().

AnnotateBreak([events, min_break_duration, ...])

Braindecode preprocessor wrapper for annotate_break().

AnnotateMovement(pos[, ...])

Braindecode preprocessor wrapper for annotate_movement().

AnnotateMuscleZscore([threshold, ch_type, ...])

Braindecode preprocessor wrapper for annotate_muscle_zscore().

AnnotateNan(*[, verbose])

Braindecode preprocessor wrapper for annotate_nan().

Metadata & Configuration#

Anonymize([daysback, keep_his, verbose])

Braindecode preprocessor wrapper for anonymize().

SetAnnotations(annotations[, emit_warning, ...])

Braindecode preprocessor wrapper for set_annotations().

SetMeasDate(meas_date)

Braindecode preprocessor wrapper for set_meas_date().

AddEvents(events[, stim_channel, replace])

Braindecode preprocessor wrapper for add_events().

FixMagCoilTypes()

Braindecode preprocessor wrapper for fix_mag_coil_types().

ApplyGradientCompensation(grade[, verbose])

Braindecode preprocessor wrapper for apply_gradient_compensation().

Data Utils#

braindecode.datautil:

save_concat_dataset(path, concat_dataset[, ...])

load_concat_dataset(path, preload[, ...])

Load a stored BaseConcatDataset from files.

infer_signal_properties(X[, y, mode, classes])

Infers signal properties from the data.
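For example, a previously created dataset can be saved to disk and reloaded later (a sketch; the path is hypothetical and windows_ds stands for an existing BaseConcatDataset):

```python
from braindecode.datautil import load_concat_dataset, save_concat_dataset

save_concat_dataset("./my_dataset_dir", windows_ds)               # path is hypothetical
restored = load_concat_dataset("./my_dataset_dir", preload=False)
```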

Samplers#

Samplers that can be used to sample EEG data for training and testing and to create batches of data, e.g. for self-supervised learning and other tasks.

braindecode.samplers:

RecordingSampler(metadata[, random_state])

Base sampler simplifying sampling from recordings.

DistributedRecordingSampler(metadata[, ...])

Base sampler simplifying sampling from recordings in distributed setting.

SequenceSampler(metadata, n_windows, ...[, ...])

Sample sequences of consecutive windows.

RelativePositioningSampler(metadata, ...[, ...])

Sample examples for the relative positioning task from [Banville2020].

DistributedRelativePositioningSampler(...[, ...])

Sample examples for the relative positioning task from [Banville2020] in distributed mode.

BalancedSequenceSampler(metadata, n_windows)

Balanced sampling of sequences of consecutive windows with categorical targets.

Augmentation#

The augmentation module follows the PyTorch transforms API. It contains transformations that can be applied to EEG data to augment it during training, which can help improve model performance. Transformations can act on the data in a variety of ways, including time-domain, frequency-domain, and spatial transformations.

braindecode.augmentation:

Transform([probability, random_state])

Basic transform class used for implementing data augmentation operations.

IdentityTransform([probability, random_state])

Identity transform.

Compose(transforms)

Transform composition.

AugmentedDataLoader(dataset[, transforms, ...])

A base dataloader class customized to apply augmentation Transforms.

TimeReverse(probability[, random_state])

Flip the time axis of each input with a given probability.

SignFlip(probability[, random_state])

Flip the sign axis of each input with a given probability.

FTSurrogate(probability[, ...])

FT surrogate augmentation of a single EEG channel, as proposed in [1].

ChannelsShuffle(probability[, p_shuffle, ...])

Randomly shuffle channels in EEG data matrix.

ChannelsDropout(probability[, p_drop, ...])

Randomly set channels to flat signal.

GaussianNoise(probability[, std, random_state])

Randomly add white noise to all channels.

ChannelsSymmetry(probability, ordered_ch_names)

Permute EEG channels inverting left and right-side sensors.

SmoothTimeMask(probability[, ...])

Smoothly replace a randomly chosen contiguous part of all channels by zeros.

BandstopFilter(probability, sfreq[, ...])

Apply a band-stop filter with desired bandwidth at a randomly selected frequency position between 0 and max_freq.

FrequencyShift(probability, sfreq[, ...])

Add a random shift in the frequency domain to all channels.

SensorsRotation(probability, ...[, axis, ...])

Interpolates EEG signals over sensors rotated around the desired axis with an angle sampled uniformly between -max_degree and max_degree.

SensorsZRotation(probability, ordered_ch_names)

Interpolates EEG signals over sensors rotated around the Z axis with an angle sampled uniformly between -max_degree and max_degree.

SensorsYRotation(probability, ordered_ch_names)

Interpolates EEG signals over sensors rotated around the Y axis with an angle sampled uniformly between -max_degree and max_degree.

SensorsXRotation(probability, ordered_ch_names)

Interpolates EEG signals over sensors rotated around the X axis with an angle sampled uniformly between -max_degree and max_degree.

Mixup(alpha[, beta_per_sample, random_state])

Implements Iterator for Mixup for EEG data.

SegmentationReconstruction(probability[, ...])

Segmentation Reconstruction from Lotte (2015) [Lotte2015].

MaskEncoding(probability[, max_mask_ratio, ...])

MaskEncoding from [1].

AmplitudeScale(probability[, interval, ...])

Rescale amplitude based on a random sampled scaling value.

ChannelsReref(probability[, random_state])

Randomly re-reference channels in EEG data matrix.
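A sketch of combining the transforms above with the AugmentedDataLoader; the probabilities, sampling frequency, and transform-specific keyword arguments are illustrative and assumed from typical usage:

```python
from braindecode.augmentation import AugmentedDataLoader, FrequencyShift, SmoothTimeMask

transforms = [
    FrequencyShift(probability=0.5, sfreq=250, max_delta_freq=2.0),  # keyword assumed
    SmoothTimeMask(probability=0.5, mask_len_samples=100),           # keyword assumed
]

# windows_ds: a windows dataset as produced by the preprocessing utilities.
train_loader = AugmentedDataLoader(windows_ds, transforms=transforms, batch_size=64)
```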

The functional augmentation API contains the same transformations as the transforms API, but they are implemented as functions.

braindecode.augmentation.functional:

identity(X, y)

Identity operation.

time_reverse(X, y)

Flip the time axis of each input.

sign_flip(X, y)

Flip the sign axis of each input.

ft_surrogate(X, y, phase_noise_magnitude, ...)

FT surrogate augmentation of a single EEG channel, as proposed in [1].

channels_dropout(X, y, p_drop[, random_state])

Randomly set channels to flat signal.

channels_shuffle(X, y, p_shuffle[, random_state])

Randomly shuffle channels in EEG data matrix.

channels_permute(X, y, permutation)

Permute EEG channels according to fixed permutation matrix.

gaussian_noise(X, y, std[, random_state])

Randomly add white Gaussian noise to all channels.

smooth_time_mask(X, y, ...)

Smoothly replace a contiguous part of all channels by zeros.

bandstop_filter(X, y, sfreq, bandwidth, ...)

Apply a band-stop filter with desired bandwidth at the desired frequency position.

frequency_shift(X, y, delta_freq, sfreq)

Adds a shift in the frequency domain to all channels.

sensors_rotation(X, y, ...)

Interpolates EEG signals over sensors rotated around the desired axis with the desired angle.

mixup(X, y, lam, idx_perm)

Mixes two channels of EEG data.

segmentation_reconstruction(X, y, ...)

Segment and reconstruct EEG data from [1].

mask_encoding(X, y, time_start, ...)

Mask encoding from Ding et al. (2024) [ding2024].

amplitude_scale(X, y, scale[, random_state])

Rescale amplitude of each channel based on a random sampled scaling value.

channels_rereference(X, y[, random_state])

Randomly re-reference channels in EEG data matrix.
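For example, the functional variants operate directly on a batch and its targets; the returned pair is assumed to be the transformed batch and the (unchanged) targets:

```python
import torch

from braindecode.augmentation.functional import gaussian_noise

X = torch.randn(8, 22, 250)   # (batch_size, n_chans, n_times)
y = torch.randint(0, 4, (8,))

X_aug, y_aug = gaussian_noise(X, y, std=0.1)
```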

Classifier#

Skorch wrapper for braindecode models. The wrapper allows braindecode models to be used with the scikit-learn API.

braindecode.classifier:

EEGClassifier(module, *args[, criterion, ...])

Classifier that does not assume softmax activation.
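A sketch of wrapping a model in EEGClassifier and training it with the skorch API; the hyperparameters are illustrative and windows_ds stands for a windows dataset created as above:

```python
import torch

from braindecode.classifier import EEGClassifier
from braindecode.models import ShallowFBCSPNet

model = ShallowFBCSPNet(n_chans=22, n_outputs=4, n_times=250)

clf = EEGClassifier(
    model,
    criterion=torch.nn.CrossEntropyLoss,
    optimizer=torch.optim.AdamW,
    lr=1e-3,
    batch_size=64,
    max_epochs=10,
    device="cuda" if torch.cuda.is_available() else "cpu",
)
clf.fit(windows_ds, y=None)  # targets are read from the windows dataset
```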

Regressor#

Skorch wrapper for braindecode models focused on regression tasks. The wrapper allows braindecode models to be used with the scikit-learn API.

braindecode.regressor:

EEGRegressor(module, *args[, cropped, ...])

Regressor that calls loss function directly.

Training#

The training module contains functions and classes for training and evaluating EEG models. They are used by the EEGClassifier and EEGRegressor skorch wrappers to train models and evaluate their performance.

braindecode.training:

CroppedLoss(loss_function)

Compute Loss after averaging predictions across time.

TimeSeriesLoss(loss_function)

Compute Loss between timeseries targets and predictions.

CroppedTrialEpochScoring(scoring[, ...])

Class to compute scores for trials from a model that predicts (super)crops.

CroppedTimeSeriesEpochScoring(scoring[, ...])

Class to compute scores for trials from a model that predicts (super)crops with time series target.

PostEpochTrainScoring(scoring[, ...])

Epoch scoring class that recomputes predictions on the training set after each epoch, in validation (eval) mode.

mixup_criterion(preds, target)

Implements loss for Mixup for EEG data.

trial_preds_from_window_preds(preds, ...)

Assigning window predictions to trials while removing duplicate predictions.

predict_trials(module, dataset[, ...])

Create trialwise predictions and optionally also return trialwise labels from cropped dataset given module.
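A sketch of CroppedLoss, which wraps a standard loss so that it is applied to predictions averaged across time; the (batch, n_outputs, n_preds_per_window) prediction layout is an assumption about cropped-model outputs:

```python
import torch

from braindecode.training import CroppedLoss

criterion = CroppedLoss(torch.nn.functional.cross_entropy)

preds = torch.randn(8, 4, 11)          # (batch, n_outputs, n_preds_per_window) - assumed
targets = torch.randint(0, 4, (8,))
loss = criterion(preds, targets)
```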

Utils#

Functions available in the braindecode.util module.

braindecode.util:

set_random_seeds(seed, cuda[, cudnn_benchmark])

Set seeds for the Python random module, numpy.random, and torch.
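For example (the seed value is illustrative):

```python
import torch

from braindecode.util import set_random_seeds

set_random_seeds(seed=20240101, cuda=torch.cuda.is_available())
```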

Visualization#

The visualization module contains functions for visualizing EEG data, including plotting confusion matrices and computing amplitude gradients. It is useful for understanding model performance and interpreting results.

braindecode.visualization:

compute_amplitude_gradients(model, dataset, ...)

Compute amplitude gradients after seeding for reproducibility.

plot_confusion_matrix(confusion_mat[, ...])

Generates a confusion matrix with additional precision and sensitivity metrics as in [1].
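A sketch of plotting a confusion matrix computed with scikit-learn; y_true and y_pred stand for integer labels and predictions (e.g. from EEGClassifier.predict), and the returned matplotlib figure is an assumption:

```python
from sklearn.metrics import confusion_matrix

from braindecode.visualization import plot_confusion_matrix

# y_true / y_pred: integer class labels and model predictions.
cm = confusion_matrix(y_true, y_pred)
fig = plot_confusion_matrix(cm)  # figure with per-class precision and sensitivity
```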