DeepEEG summary - CAN 2019 - Toronto
----
Python
based on MNE Raw and Epochs objects
Loads (sketch below)
-Muse data from eeg-notebooks - Colab example
-various example data included
-simulated data from MNE - Colab example
-simulated from real data
-time or frequency domain options
-change magnitude of effects and differences
-change signal-to-noise ratio
-add eye blinks
-raw or epoched data from BrainVision (BV) and other amplifiers - Colab example
-connect Colab to Google Drive
-run locally by downloading the repo
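
A minimal sketch of the loading and simulate-from-real-data paths using plain MNE and NumPy; the file name, effect shape, and scaling values are hypothetical, and DeepEEG's own loaders and simulator may differ in detail.

import numpy as np
import mne

# Load a BrainVision recording into an MNE Raw object (file name hypothetical).
raw = mne.io.read_raw_brainvision("sub-01_task-oddball.vhdr", preload=True)

# Build Epochs, the object the rest of the pipeline works from.
events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8, baseline=(None, 0), preload=True)

# Simulate from real data: copy the trials, add an ERP-like effect of adjustable
# magnitude to one condition, and scale added noise to set the SNR.
data = epochs.get_data()                      # (trials, channels, times)
labels = np.random.randint(0, 2, len(data))   # two simulated conditions
effect = 2e-6 * np.exp(-((epochs.times - 0.3) ** 2) / (2 * 0.05 ** 2))  # ~2 uV bump at 300 ms
noise_scale = 1.0                             # raise to lower the SNR
sim = data + labels[:, None, None] * effect[None, None, :]
sim = sim + noise_scale * data.std() * np.random.randn(*sim.shape)
sim_epochs = mne.EpochsArray(sim, epochs.info, tmin=epochs.tmin)
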
New class (Feats) created from the Epochs object
Epochs are created with various custom implementations of classic ERP methods (sketch below):
-Gratton eye-movement correction
-mastoid re-reference
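
A rough equivalent of those two steps with standard MNE calls; the mastoid channel names are assumed (TP9/TP10), and MNE's EOGRegression stands in here for DeepEEG's custom Gratton routine.

import mne

epochs = mne.read_epochs("sub-01-epo.fif", preload=True)  # hypothetical epochs file

# Mastoid re-reference: average of the two mastoid electrodes.
epochs.set_eeg_reference(ref_channels=["TP9", "TP10"])

# Regression-based (Gratton-style) ocular correction; needs an EOG channel.
weights = mne.preprocessing.EOGRegression(picks="eeg", picks_artifact="eog").fit(epochs)
epochs_clean = weights.apply(epochs)
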
Feats - time or frequency domain (sketch below)
-frequency domain - power, or power and phase concatenated
-baselined or raw spectrograms
-time domain - filtered, single-trial ERPs
-outputs X and Y (data and labels) to train models
-automatically shaped for input to the model
-watermark option to test models
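
A sketch of the frequency-domain path, building X and y by hand with MNE and NumPy; the exact shaping, baselining, and watermark options in DeepEEG's feature step may differ.

import numpy as np
import mne
from mne.time_frequency import tfr_morlet

epochs = mne.read_epochs("sub-01-epo.fif", preload=True)  # hypothetical epochs file

# Per-trial power spectrograms (frequency-domain features).
freqs = np.arange(4.0, 30.0, 2.0)
power = tfr_morlet(epochs, freqs=freqs, n_cycles=freqs / 2.0,
                   return_itc=False, average=False)

# X: (trials, channels, freqs, times) reshaped to an image-like array with one
# trailing channel so a 2D CNN can consume it; y: binary labels from event codes.
X = power.data.astype("float32")
X = X.reshape(len(X), -1, X.shape[-1], 1)
first_code = list(epochs.event_id.values())[0]
y = (epochs.events[:, 2] == first_code).astype("int32")
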
CreateModel (sketch below)
-Keras/TensorFlow
-high-level, abstracted, object-oriented API
-model types: NN, CNN, CNN3D, LSTM, Auto, AutoDeep
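
For the CNN case, a minimal Keras/TensorFlow sketch of a binary EEG classifier; layer sizes and depth are illustrative, not the architecture CreateModel actually builds.

from tensorflow.keras import layers, models

def create_cnn(input_shape):
    # Small 2D CNN ending in a sigmoid for binary class prediction.
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, (3, 3), padding="same", activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(32, (3, 3), padding="same", activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.25),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# e.g. 64 channel-frequency rows x 100 time points, one trailing image channel
model = create_cnn(input_shape=(64, 100, 1))
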
TrainTestVal (sketch below)
-test and validation sets held out from training
-predicts a binary class
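
Continuing the sketches above, held-out validation and test splits via scikit-learn plus a standard Keras fit/evaluate loop; split sizes are arbitrary, not DeepEEG's defaults.

from sklearn.model_selection import train_test_split

# X, y from the feature sketch; model from the CreateModel sketch above.
X_trainval, X_test, y_trainval, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X_trainval, y_trainval, test_size=0.25, stratify=y_trainval, random_state=0)

model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=20, batch_size=32)
model.evaluate(X_test, y_test)                           # test set touched only once
y_pred = (model.predict(X_test) > 0.5).astype("int32")   # binary class predictions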