
EEGDash example for eyes open vs. closed classification.

This example uses the :mod:`eegdash` library in combination with PyTorch to develop a deep learning model for analyzing EEG data, specifically for eyes open vs. closed classification in a single subject.

1. **Data Retrieval Using EEGDash**: An instance of :class:`eegdash.api.EEGDashDataset` is created to search and retrieve an EEG dataset. At this step, only the metadata is transferred.

2. **Data Preprocessing Using Braindecode**: This step preprocesses the EEG data with Braindecode by reannotating events, selecting specific channels, resampling, filtering, and extracting 2-second epochs, ensuring balanced eyes-open and eyes-closed data for analysis.

# Data Retrieval Using EEGDash
# ----------------------------
#
# This section instantiates :class:`eegdash.api.EEGDashDataset` to fetch
# the metadata for the experiment before requesting any recordings.
#
# First we find one resting state dataset. This dataset contains both eyes open
# and eyes closed data.
from pathlib import Path
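As a minimal sketch of what this lookup could look like: every identifier below is a placeholder rather than a value from this example, and the :class:`eegdash.api.EEGDashDataset` call itself is left commented so that no download is triggered here.

```python
# Placeholder query values: the dataset accession and subject label
# here are hypothetical, not the ones this example actually uses.
query = {
    "dataset": "ds005514",      # an OpenNeuro-style accession (assumed)
    "task": "RestingState",
    "subject": "NDARAB000000",  # made-up subject label
}

# The retrieval itself would pass these fields to EEGDashDataset;
# at this step only the metadata is transferred, not the recordings:
# from eegdash import EEGDashDataset
# dataset = EEGDashDataset(cache_dir="eegdash_cache", **query)

print(sorted(query))  # ['dataset', 'subject', 'task']
```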
# Data Preprocessing Using Braindecode
# ------------------------------------
#
# `braindecode <https://braindecode.org/stable/install/install.html>`__ is a
# specialized library for preprocessing EEG and MEG data. In this dataset, there
# are two key events in the continuous data: **instructed_toCloseEyes**, marking
# the start of a 40-second eyes-closed period, and **instructed_toOpenEyes**,
# marking the start of the subsequent eyes-open period. For the eyes-closed
# condition, we extract data from 15 to 29 seconds
# after the event onset. Similarly, for the eyes-open event, we extract data
# from 5 to 19 seconds after the event onset. This ensures an equal amount of
# data for both conditions. The event extraction is handled by the custom
# function :func:`eegdash.hbn.preprocessing.hbn_ec_ec_reannotation`.
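To make the bookkeeping concrete, the arithmetic below checks that the two extraction windows hold the same amount of data and how many 2-second epochs each yields. The 5-19 s eyes-open window is from the text; the 15-29 s eyes-closed window used here is an assumption of this sketch.

```python
sfreq = 128      # Hz, the target sampling rate after resampling
epoch_len_s = 2  # 2-second non-overlapping epochs

eyes_closed = (15, 29)  # seconds after onset (assumed window in this sketch)
eyes_open = (5, 19)     # seconds after onset, as described above

def window_stats(start, stop, sfreq, epoch_len_s):
    """Return (window length in s, samples at sfreq, whole epochs that fit)."""
    length = stop - start
    return length, length * sfreq, length // epoch_len_s

closed_stats = window_stats(*eyes_closed, sfreq, epoch_len_s)
open_stats = window_stats(*eyes_open, sfreq, epoch_len_s)
assert closed_stats == open_stats  # equal data for both conditions
print(closed_stats)  # (14, 1792, 7)
```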
#
# Next, we apply four preprocessing steps in Braindecode:
#
# 1. **Reannotation** of event markers using :func:`eegdash.hbn.preprocessing.hbn_ec_ec_reannotation`.
# 2. **Selection** of 24 specific EEG channels from the original 128.
# 3. **Resampling** the EEG data to a frequency of 128 Hz.
# 4. **Filtering** the EEG signals to retain frequencies between 1 Hz and 55 Hz.
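The four steps above can be sketched as a pipeline specification. The channel subset below is a placeholder rather than the 24 channels this example actually selects, and the Braindecode calls are left commented so the sketch runs on its own.

```python
# Each step as an (MNE Raw method name, keyword arguments) pair;
# braindecode's Preprocessor forwards such calls to the underlying MNE objects.
selected_channels = ["E22", "E9", "E33"]  # placeholder subset, not the real 24
steps = [
    ("pick_channels", {"ch_names": selected_channels}),
    ("resample", {"sfreq": 128}),
    ("filter", {"l_freq": 1.0, "h_freq": 55.0}),
]

# With braindecode this would become (reannotation prepended as step 1):
# from braindecode.preprocessing import Preprocessor, preprocess
# preprocessors = [Preprocessor(name, **kwargs) for name, kwargs in steps]
# preprocess(dataset, preprocessors)

print([name for name, _ in steps])  # ['pick_channels', 'resample', 'filter']
```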