braindecode.datasets.create_from_mne_raw
- braindecode.datasets.create_from_mne_raw(raws, trial_start_offset_samples, trial_stop_offset_samples, window_size_samples, window_stride_samples, drop_last_window, descriptions=None, mapping=None, preload=False, drop_bad_windows=True, accepted_bads_ratio=0.0)
Create WindowsDatasets from mne.RawArrays
- Parameters
raws (array-like) – list of mne.RawArrays
trial_start_offset_samples (int) – start offset from original trial onsets in samples
trial_stop_offset_samples (int) – stop offset from original trial stop in samples
window_size_samples (int) – window size
window_stride_samples (int) – stride between windows
drop_last_window (bool) – whether to drop the last window when the windows do not evenly divide the continuous signal (if False, a final overlapping window is kept)
descriptions (array-like) – list of dicts or pandas.Series with additional information about the raws
mapping (dict(str: int)) – mapping from event description to target value
preload (bool) – If True, preload the data of the mne.Epochs objects.
drop_bad_windows (bool) – If True, call .drop_bad() on the resulting mne.Epochs object. This step allows identifying, e.g., windows that fall outside the continuous recording. It is recommended to run this step here, as otherwise the BaseConcatDataset would have to be updated as well.
accepted_bads_ratio (float, optional) – Acceptable proportion of trials with inconsistent length in a raw. If the proportion of trials whose length is exceeded by the window size is smaller than this ratio, only the corresponding trials are dropped and the computation continues; otherwise, an error is raised. Defaults to 0.0 (raise an error).
- Returns
windows_datasets – X and y transformed to a dataset format that is compatible with skorch and braindecode
- Return type
BaseConcatDataset
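A minimal usage sketch (not part of the original documentation): it builds synthetic, annotated mne.io.RawArray recordings and cuts them into fixed-size windows. The channel names, sampling rate, annotation descriptions, metadata, and window parameters below are illustrative assumptions.

import numpy as np
import mne
from braindecode.datasets import create_from_mne_raw

sfreq = 100.0  # sampling rate in Hz (assumed for this sketch)
info = mne.create_info(ch_names=["C3", "Cz", "C4"], sfreq=sfreq, ch_types="eeg")

raws = []
for _ in range(2):
    data = np.random.randn(3, int(60 * sfreq))  # 60 s of synthetic EEG, 3 channels
    raw = mne.io.RawArray(data, info)
    # Annotate trial onsets; descriptions must match the keys of `mapping`
    raw.set_annotations(mne.Annotations(
        onset=[5.0, 25.0, 45.0], duration=[4.0, 4.0, 4.0],
        description=["left_hand", "right_hand", "left_hand"]))
    raws.append(raw)

windows_dataset = create_from_mne_raw(
    raws,
    trial_start_offset_samples=0,
    trial_stop_offset_samples=0,
    window_size_samples=200,    # 2 s windows
    window_stride_samples=100,  # 1 s stride, i.e. 50% overlap
    drop_last_window=False,
    descriptions=[{"subject": 1}, {"subject": 2}],  # optional per-raw metadata
    mapping={"left_hand": 0, "right_hand": 1},
)

# Each item yields the window array, its target, and window index information
X, y, window_ind = windows_dataset[0]
print(X.shape, y, window_ind)

The resulting dataset can then be passed directly to a skorch-based braindecode model for training, since each item already provides the (X, y) pair expected there.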