
Commit 081e825

updating readme file
1 parent 2f1b2a8 commit 081e825

File tree

2 files changed: +3 -3 lines changed

README.md

Lines changed: 2 additions & 2 deletions
@@ -88,9 +88,9 @@ This script is the first of the pre-processing scripts. It runs all the people i
88    One of the issues we encountered was that some participants had their data collected using the wrong configuration file. This is taken care of.
89    The data is down-sampled from 512 Hz to 256 Hz.
90    Externals are all deleted since not everyone has externals, so we cannot use them as a reference.
91 -  We apply a 1 Hz (filter order 1690) and 50 Hz (filter order 136) filter.
91 +  We apply a 1 Hz and 50 Hz filter.
92    We add channel info to all the channels. For this we use the following 3 files: standard-10-5-cap385, BioSemi160, BioSemi64. The first 2 are from BESA and have the correct layout. The 3rd is needed for the MoBI data. You can find these in the Functions and files folder (inside the src folder).
93 -  Lastly this script uses EEGLAB's clean_artifacts function to delete the bad channels. Channels get deleted by the standard noise criteria, if they are flat for over 5 seconds, and the function checks if channels are overly correlated with each other. The function also deletes continuous data if there is bad data: it breaks the data into 0.5-second steps, and data is bad if it is off by 5 std dev, which is "quite conservative" and the default value according to the function.
93 +  Lastly this script uses EEGLAB's clean_artifacts function to delete the bad channels and bad parts of the data. Channels get deleted by the standard noise criteria, if they are flat for over 5 seconds, and the function checks if channels are overly correlated with each other. After that it divides the data into 0.5-second epochs and looks for peaks that exceed 20 standard deviations of the channel's amplitude; these get deleted. Because it is possible that only a very short stretch of data remains between two of these moments, we also delete all the continuous data between two boundaries if it is less than 2 seconds long.
94
95    #### C_manual_check
96    This script plots all the data in EEGLAB as continuous data and allows you to delete channels manually.
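The pre-processing steps described in the README map onto standard EEGLAB calls. Below is a minimal sketch, assuming a 1 Hz high-pass plus a 50 Hz low-pass (the README does not say whether the 50 Hz filter is a low-pass or a notch), and with illustrative values for the external channel names and the clean_artifacts thresholds that the commit does not spell out:

```matlab
% Minimal EEGLAB sketch of the pre-processing described above.
% Assumes an EEGLAB dataset EEG is already loaded (e.g. via pop_loadset).
% Values and channel names marked "assumed" are illustrative, not taken from the actual script.

EEG = pop_resample(EEG, 256);                               % down-sample 512 Hz -> 256 Hz

% Delete the externals (channel names assumed).
EEG = pop_select(EEG, 'nochannel', {'EXG1','EXG2','EXG3','EXG4','EXG5','EXG6','EXG7','EXG8'});

% 1 Hz and 50 Hz filters (assumed: 1 Hz high-pass, 50 Hz low-pass).
EEG = pop_eegfiltnew(EEG, 1, []);
EEG = pop_eegfiltnew(EEG, [], 50);

% Channel locations from the BESA layout file in "Functions and files".
EEG = pop_chanedit(EEG, 'lookup', 'standard-10-5-cap385.elp');

% Bad channels (flat > 5 s, poorly correlated) and bad 0.5 s windows (> 20 SD).
EEG = clean_artifacts(EEG, ...
    'FlatlineCriterion', 5, ...
    'ChannelCriterion',  0.8, ...   % correlation threshold (assumed value)
    'BurstCriterion',    20, ...
    'WindowCriterion',   0.25);     % assumed; the 2 s minimum-segment rule may be a separate step
```

The C_manual_check step then presumably opens the continuous data in EEGLAB's scroll viewer, e.g. pop_eegplot(EEG, 1, 1, 1), so that bad channels can be deleted by hand.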

src/D_preprocces2.m

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@
30    for s=1:length(subject_list)
31    data_path = [home_path subject_list{s} '\'];
32    fprintf('\n\n\n**** %s: Loading dataset ****\n\n\n', subject_list{s});
33 -  EEG = pop_loadset('filename', [subject_list{s} '_exchn.set'], 'filepath', data_path);
33 +  EEG = pop_loadset('filename', [subject_list{s} '_exch.set'], 'filepath', data_path);
34    trigger_info = 'Has trigger 50 and 51';
35    amount_triggers = length(EEG.event);
36    %% looking for people without triggers and sees if they at least have a logfile and thus if they were run with the paradigm
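The comment at line 36 describes scanning for participants without triggers 50 and 51. A hypothetical sketch of such a check, assuming EEG.event.type holds numeric codes (the actual script may store them as strings or handle the logfile lookup differently):

```matlab
% Hypothetical illustration only, not taken from the repository:
% test whether triggers 50 and 51 are present in the loaded dataset.
event_types = [EEG.event.type];          % assumes numeric event codes
if ~(any(event_types == 50) && any(event_types == 51))
    fprintf('%s: triggers 50/51 missing, checking for a logfile instead\n', subject_list{s});
end
```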
