# EEG Stress Detection

Classification of stress using EEG recordings from the SAM 40 dataset. A description of the dataset can be found [here](https://www.sciencedirect.com/science/article/pii/S2352340921010465).

## Files

**dataset**

Contains functions for loading and transforming the dataset.

```load_dataset(data_type="ica_filtered", test_type="Arithmetic")```

Loads data from the SAM 40 dataset with the test specified by `test_type`.
The `data_type` parameter specifies which of the datasets to load. Possible values are `raw`, `wt_filtered` and `ica_filtered`.
Returns an ndarray with shape (120, 32, 3200).
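
For example (a usage sketch; the import path is hypothetical, and the 120 recordings correspond to 40 subjects with 3 trials each per test):

```python
from dataset import load_dataset  # hypothetical import path for the functions above

# Load the ICA-filtered recordings for the Arithmetic test
data = load_dataset(data_type="ica_filtered", test_type="Arithmetic")
print(data.shape)  # (120, 32, 3200): 40 subjects x 3 trials, 32 channels, 25 s at 128 Hz
```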

```load_labels()```

Loads labels from the dataset and transforms the label values to binary values.
Values greater than 5 are set to 1 and values of 5 or lower are set to 0.
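
The thresholding itself is a one-liner with NumPy; a sketch, where `ratings` is a hypothetical name for the raw self-assessment scores:

```python
import numpy as np

ratings = np.array([3, 7, 5, 9])    # example raw label values
labels = (ratings > 5).astype(int)  # > 5 -> 1 (stress), <= 5 -> 0
print(labels)                       # [0 1 0 1]
```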

```format_labels(labels, test_type="Arithmetic", epochs=1)```

Filters the labels to keep only those from the test type specified by `test_type`.
Repeats the labels by the number of epochs in a recording, specified by `epochs`.
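
A minimal sketch of the repetition step (the test-type filtering is omitted here because the label file layout is not described in this README):

```python
import numpy as np

def format_labels_sketch(labels, epochs=1):
    # Hypothetical reimplementation: repeat each trial label once per epoch
    # so the labels line up with the epoched data from split_data.
    return np.repeat(labels, epochs)

print(format_labels_sketch(np.array([0, 1]), epochs=3))  # [0 0 0 1 1 1]
```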

```split_data(dataset, sfreq)```

Splits the EEG data into epochs of length 1 second.
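
One possible implementation, shown as a sketch rather than the repository code; with `sfreq=128` (the SAM 40 sampling rate) a (120, 32, 3200) array becomes (120, 25, 32, 128), the input shape expected by the feature functions below:

```python
import numpy as np

def split_data_sketch(dataset, sfreq):
    """Split (n_trials, n_channels, n_times) into 1 s epochs:
    (n_trials, n_secs, n_channels, sfreq)."""
    n_trials, n_channels, n_times = dataset.shape
    n_secs = n_times // sfreq
    epochs = dataset[:, :, : n_secs * sfreq].reshape(n_trials, n_channels, n_secs, sfreq)
    return epochs.transpose(0, 2, 1, 3)
```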

**filtering**

A notebook for filtering data using bandpass filtering, Savitzky-Golay filtering and ICA filtering.

ICA components corresponding to noise and artifacts can be identified by visual inspection and removed; the selection is performed using a GUI.

The data can be saved to a directory to be used for classification.

The filtering is performed using the [```mne``` package](https://mne.tools/stable/index.html), a Python package specialised in MEG and EEG analysis and visualisation.
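
As an illustration of that workflow, a sketch with assumed filter settings and component indices (the notebook itself may differ):

```python
import numpy as np
import mne

sfreq = 128                         # SAM 40 sampling rate
trial = np.random.randn(32, 3200)   # stand-in for one recording from load_dataset
raw = mne.io.RawArray(trial, mne.create_info(32, sfreq, ch_types="eeg"))

raw.filter(l_freq=1.0, h_freq=45.0)  # bandpass filter

ica = mne.preprocessing.ICA(n_components=20, random_state=0)
ica.fit(raw)
ica.plot_sources(raw)    # visual inspection of component time courses
ica.exclude = [0, 1]     # example indices chosen after inspection
cleaned = ica.apply(raw.copy())
```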

**features**

```time_series_features(data)```

Computes peak-to-peak amplitude, variance and RMS features using the package `mne_features`.
The data should be of the form (n_trials, n_secs, n_channels, sfreq).
The output is of the form (n_trials\*n_secs, n_channels\*n_features).
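
Each of these wrappers plausibly reduces to a reshape plus a call to `extract_features` from `mne_features`; a sketch, where `ptp_amp`, `variance` and `rms` are the library's built-in extractor names (the documented functions take only `data`, so the sampling rate is presumably a global; here it is a parameter):

```python
from mne_features.feature_extraction import extract_features

def time_series_features_sketch(data, sfreq=128):
    n_trials, n_secs, n_channels, n_times = data.shape
    X = data.reshape(n_trials * n_secs, n_channels, n_times)
    # Output: (n_trials * n_secs, n_channels * 3) feature matrix
    return extract_features(X, sfreq, selected_funcs=["ptp_amp", "variance", "rms"])
```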

```freq_band_features(data, freq_bands)```

Computes the power in the delta, theta, alpha, beta and gamma frequency bands using the package `mne_features`.
The data should be of the form (n_trials, n_secs, n_channels, sfreq).
The output is of the form (n_trials\*n_secs, n_channels\*n_features).
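
A corresponding sketch using the library's `pow_freq_bands` extractor; the band edges below are common textbook values, not taken from this repository:

```python
import numpy as np
from mne_features.feature_extraction import extract_features

def freq_band_features_sketch(data, freq_bands, sfreq=128):
    X = data.reshape(-1, data.shape[2], data.shape[3])
    return extract_features(X, sfreq, selected_funcs=["pow_freq_bands"],
                            funcs_params={"pow_freq_bands__freq_bands": freq_bands})

# Delta/theta/alpha/beta/gamma edges in Hz (example values)
bands = np.array([0.5, 4.0, 8.0, 13.0, 30.0, 45.0])
```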

```hjorth_features(data)```

Computes the spectral Hjorth mobility and spectral Hjorth complexity features using the package `mne_features`.
The data should be of the form (n_trials, n_secs, n_channels, sfreq).
The output is of the form (n_trials\*n_secs, n_channels\*n_features).
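
The spectral Hjorth parameters map to the `hjorth_mobility_spect` and `hjorth_complexity_spect` extractor names; a sketch following the same pattern:

```python
from mne_features.feature_extraction import extract_features

def hjorth_features_sketch(data, sfreq=128):
    X = data.reshape(-1, data.shape[2], data.shape[3])
    return extract_features(
        X, sfreq, selected_funcs=["hjorth_mobility_spect", "hjorth_complexity_spect"])
```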

```fractal_features(data)```

Computes the Higuchi fractal dimension and Katz fractal dimension using the package `mne_features`.
The data should be of the form (n_trials, n_secs, n_channels, sfreq).
The output is of the form (n_trials\*n_secs, n_channels\*n_features).
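
A sketch using the `higuchi_fd` and `katz_fd` extractor names:

```python
from mne_features.feature_extraction import extract_features

def fractal_features_sketch(data, sfreq=128):
    X = data.reshape(-1, data.shape[2], data.shape[3])
    return extract_features(X, sfreq, selected_funcs=["higuchi_fd", "katz_fd"])
```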

```entropy_features(data)```

Computes the approximate entropy, sample entropy, spectral entropy and SVD entropy features using the package `mne_features`.
The data should be of the form (n_trials, n_secs, n_channels, sfreq).
The output is of the form (n_trials\*n_secs, n_channels\*n_features).
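
A sketch using the `app_entropy`, `samp_entropy`, `spect_entropy` and `svd_entropy` extractor names:

```python
from mne_features.feature_extraction import extract_features

def entropy_features_sketch(data, sfreq=128):
    X = data.reshape(-1, data.shape[2], data.shape[3])
    return extract_features(
        X, sfreq,
        selected_funcs=["app_entropy", "samp_entropy", "spect_entropy", "svd_entropy"])
```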

**classification**

Classification using the features computed in **features**. Uses a KNN classifier, an SVM classifier and an MLP to classify.
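
A minimal sketch of such a comparison with scikit-learn (an assumption; the README does not name the classification library), using random stand-in data:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X = np.random.randn(3000, 96)      # stand-in for a feature matrix from **features**
y = np.random.randint(0, 2, 3000)  # stand-in for the binary stress labels
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

for clf in (KNeighborsClassifier(), SVC(), MLPClassifier(max_iter=500)):
    model = make_pipeline(StandardScaler(), clf)  # scaling matters for SVM and MLP
    model.fit(X_train, y_train)
    print(type(clf).__name__, model.score(X_test, y_test))
```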

**variables**

Script containing the global variables used in the project.