.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/plot_load_save_datasets.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_examples_plot_load_save_datasets.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_plot_load_save_datasets.py:


Load and save dataset example
=============================

In this example, we show how to load and save braindecode datasets.

.. GENERATED FROM PYTHON SOURCE LINES 7-20

.. code-block:: default


    # Authors: Lukas Gemein <l.gemein@gmail.com>
    #
    # License: BSD (3-clause)

    import tempfile

    from braindecode.datasets import MOABBDataset
    from braindecode.datautil import load_concat_dataset
    from braindecode.preprocessing import (
        create_windows_from_events,
        preprocess,
        Preprocessor,
    )

.. GENERATED FROM PYTHON SOURCE LINES 21-22

First, we load a dataset using MOABB.

.. GENERATED FROM PYTHON SOURCE LINES 22-27

.. code-block:: default


    dataset = MOABBDataset(
        dataset_name='BNCI2014001',
        subject_ids=[1],
    )

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]

.. GENERATED FROM PYTHON SOURCE LINES 28-30

We can apply preprocessing steps to the dataset. It is also possible to skip
this step and not apply any preprocessing.

.. GENERATED FROM PYTHON SOURCE LINES 30-35
.. code-block:: default


    preprocess(
        concat_ds=dataset,
        preprocessors=[Preprocessor(fn='resample', sfreq=10)]
    )

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    48 events found
    Event IDs: [1 2 3 4]
    <braindecode.datasets.moabb.MOABBDataset object at 0x7f748ee50850>

.. GENERATED FROM PYTHON SOURCE LINES 36-42

We save the dataset to an existing directory. It will create a '.fif' file
for every dataset in the concat dataset. Additionally, it will create two
JSON files, the first holding the description of the dataset, the second
holding the name of the target. If you want to save to the same directory
several times, for example when trying different preprocessing settings,
you can choose to overwrite the existing files.

.. GENERATED FROM PYTHON SOURCE LINES 42-49

.. code-block:: default


    tmpdir = tempfile.mkdtemp()  # write in a temporary directory
    dataset.save(
        path=tmpdir,
        overwrite=False,
    )

.. rst-class:: sphx-glr-script-out

 Out:
 .. code-block:: none

    Writing /tmp/tmp4_0_urks/0/0-raw.fif
    Closing /tmp/tmp4_0_urks/0/0-raw.fif [done]
    Writing /tmp/tmp4_0_urks/1/1-raw.fif
    Closing /tmp/tmp4_0_urks/1/1-raw.fif [done]
    Writing /tmp/tmp4_0_urks/2/2-raw.fif
    Closing /tmp/tmp4_0_urks/2/2-raw.fif [done]
    Writing /tmp/tmp4_0_urks/3/3-raw.fif
    Closing /tmp/tmp4_0_urks/3/3-raw.fif [done]
    Writing /tmp/tmp4_0_urks/4/4-raw.fif
    Closing /tmp/tmp4_0_urks/4/4-raw.fif [done]
    Writing /tmp/tmp4_0_urks/5/5-raw.fif
    Closing /tmp/tmp4_0_urks/5/5-raw.fif [done]
    Writing /tmp/tmp4_0_urks/6/6-raw.fif
    Closing /tmp/tmp4_0_urks/6/6-raw.fif [done]
    Writing /tmp/tmp4_0_urks/7/7-raw.fif
    Closing /tmp/tmp4_0_urks/7/7-raw.fif [done]
    Writing /tmp/tmp4_0_urks/8/8-raw.fif
    Closing /tmp/tmp4_0_urks/8/8-raw.fif [done]
    Writing /tmp/tmp4_0_urks/9/9-raw.fif
    Closing /tmp/tmp4_0_urks/9/9-raw.fif [done]
    Writing /tmp/tmp4_0_urks/10/10-raw.fif
    Closing /tmp/tmp4_0_urks/10/10-raw.fif [done]
    Writing /tmp/tmp4_0_urks/11/11-raw.fif
    Closing /tmp/tmp4_0_urks/11/11-raw.fif [done]

.. GENERATED FROM PYTHON SOURCE LINES 50-56

We load the saved dataset from a directory. Signals can be preloaded, as in
MNE. Optionally, only specific '.fif' files can be loaded by specifying their
ids. The target name can be changed if the dataset supports it (TUHAbnormal,
for example, supports 'pathological', 'age', and 'gender'; if you stored a
preprocessed version with target 'pathological', it is possible to change
the target upon loading).

.. GENERATED FROM PYTHON SOURCE LINES 56-63

.. code-block:: default


    dataset_loaded = load_concat_dataset(
        path=tmpdir,
        preload=True,
        ids_to_load=[1, 3],
        target_name=None,
    )

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Opening raw data file /tmp/tmp4_0_urks/1/1-raw.fif...
        Range : 0 ... 3868 = 0.000 ... 386.800 secs
    Ready.
    Reading 0 ... 3868 = 0.000 ... 386.800 secs...
    Opening raw data file /tmp/tmp4_0_urks/3/3-raw.fif...
        Range : 0 ... 3868 = 0.000 ... 386.800 secs
    Ready.
    Reading 0 ... 3868 = 0.000 ... 386.800 secs...
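The ``ids_to_load`` indices refer to the numbered subdirectories that ``save``
created, one per recording. The directory shape can be illustrated with a small
stdlib-only sketch (placeholder empty files stand in for the real '.fif' data,
so this runs without braindecode or any downloads):

```python
import tempfile
from pathlib import Path

# Recreate the layout produced by dataset.save() above: one numbered
# subdirectory per recording, each holding a file named <i>-raw.fif.
# (Placeholder files only -- an illustration, not real recordings.)
root = Path(tempfile.mkdtemp())
for i in range(12):
    (root / str(i)).mkdir()
    (root / str(i) / f"{i}-raw.fif").touch()

# ids_to_load=[1, 3] selects exactly these files:
selected = [root / str(i) / f"{i}-raw.fif" for i in (1, 3)]
print([p.exists() for p in selected])  # [True, True]
```

This is why loading with ``ids_to_load=[1, 3]`` above opened only
``1/1-raw.fif`` and ``3/3-raw.fif``.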
.. GENERATED FROM PYTHON SOURCE LINES 64-66

The serialization utility also supports WindowsDatasets, so we create compute
windows next.

.. GENERATED FROM PYTHON SOURCE LINES 66-74

.. code-block:: default


    windows_dataset = create_windows_from_events(
        concat_ds=dataset_loaded,
        trial_start_offset_samples=0,
        trial_stop_offset_samples=0,
    )

    windows_dataset.description

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Used Annotations descriptions: ['feet', 'left_hand', 'right_hand', 'tongue']
    Adding metadata with 4 columns
    Replacing existing metadata with 4 columns
    48 matching events found
    No baseline correction applied
    0 projection items activated
    Loading data for 48 events and 40 original time points ...
    0 bad epochs dropped
    Used Annotations descriptions: ['feet', 'left_hand', 'right_hand', 'tongue']
    Adding metadata with 4 columns
    Replacing existing metadata with 4 columns
    48 matching events found
    No baseline correction applied
    0 projection items activated
    Loading data for 48 events and 40 original time points ...
    0 bad epochs dropped

.. raw:: html

    <div class="output_subarea output_html rendered_html output_result">
    <div>
    <style scoped>
        .dataframe tbody tr th:only-of-type {
            vertical-align: middle;
        }

        .dataframe tbody tr th {
            vertical-align: top;
        }

        .dataframe thead th {
            text-align: right;
        }
    </style>
    <table border="1" class="dataframe">
      <thead>
        <tr style="text-align: right;">
          <th></th>
          <th>subject</th>
          <th>session</th>
          <th>run</th>
        </tr>
      </thead>
      <tbody>
        <tr>
          <th>0</th>
          <td>1</td>
          <td>session_T</td>
          <td>run_1</td>
        </tr>
        <tr>
          <th>1</th>
          <td>1</td>
          <td>session_T</td>
          <td>run_3</td>
        </tr>
      </tbody>
    </table>
    </div>
    </div>
    <br />
    <br />

.. GENERATED FROM PYTHON SOURCE LINES 75-81

Again, we save the dataset to an existing directory. It will create a
'-epo.fif' file for every dataset in the concat dataset. Additionally, it
will create a JSON file holding the description of the dataset.
If you want to save to the same directory several times, for example when
trying different windowing parameters, you can choose to overwrite the
existing files.

.. GENERATED FROM PYTHON SOURCE LINES 81-86

.. code-block:: default


    windows_dataset.save(
        path=tmpdir,
        overwrite=True,
    )

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Loading data for 1 events and 40 original time points ...
    Loading data for 48 events and 40 original time points ...
    Loading data for 1 events and 40 original time points ...
    Loading data for 48 events and 40 original time points ...
    /home/runner/work/braindecode/braindecode/braindecode/datasets/base.py:569: UserWarning: The number of saved datasets (2) does not match the number of existing subdirectories (12). You may now encounter a mix of differently preprocessed datasets!
      f"datasets!", UserWarning)
    /home/runner/work/braindecode/braindecode/braindecode/datasets/base.py:573: UserWarning: Chosen directory /tmp/tmp4_0_urks contains other subdirectories or files ['10', '8', '5', '7', '3', '11', '6', '2', '4', '9'].
      warnings.warn(f'Chosen directory {path} contains other '

.. GENERATED FROM PYTHON SOURCE LINES 87-90

We load the saved dataset from a directory again. Signals can be preloaded,
as in MNE. Optionally, only specific '-epo.fif' files can be loaded by
specifying their ids.

.. GENERATED FROM PYTHON SOURCE LINES 90-98

.. code-block:: default


    windows_dataset_loaded = load_concat_dataset(
        path=tmpdir,
        preload=False,
        ids_to_load=[0],
        target_name=None,
    )

    windows_dataset_loaded.description

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Reading /tmp/tmp4_0_urks/0/0-epo.fif ...
        Found the data of interest:
            t = 0.00 ... 3900.00 ms
            0 CTF compensation matrices available
    Adding metadata with 4 columns
    Replacing existing metadata with 4 columns
    48 matching events found
    No baseline correction applied
    0 projection items activated
.. raw:: html

    <div class="output_subarea output_html rendered_html output_result">
    <div>
    <style scoped>
        .dataframe tbody tr th:only-of-type {
            vertical-align: middle;
        }

        .dataframe tbody tr th {
            vertical-align: top;
        }

        .dataframe thead th {
            text-align: right;
        }
    </style>
    <table border="1" class="dataframe">
      <thead>
        <tr style="text-align: right;">
          <th></th>
          <th>subject</th>
          <th>session</th>
          <th>run</th>
        </tr>
      </thead>
      <tbody>
        <tr>
          <th>0</th>
          <td>1</td>
          <td>session_T</td>
          <td>run_1</td>
        </tr>
      </tbody>
    </table>
    </div>
    </div>
    <br />
    <br />

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 4.979 seconds)

**Estimated memory usage:** 408 MB


.. _sphx_glr_download_auto_examples_plot_load_save_datasets.py:


.. only:: html

  .. container:: sphx-glr-footer
     :class: sphx-glr-footer-example


    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_load_save_datasets.py <plot_load_save_datasets.py>`


    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_load_save_datasets.ipynb <plot_load_save_datasets.ipynb>`


.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_