# Introduction
This is the Army Research Laboratory (ARL) EEGModels project: a collection of Convolutional Neural Network (CNN) models for EEG signal processing and classification, written in Keras and TensorFlow. The aims of this project are to

- provide a set of well-validated CNN models for EEG signal processing and classification,
- facilitate reproducible research, and
- enable other researchers to use and compare these models as easily as possible on their own data.
# Requirements
- Python == 3.7 or 3.8
- tensorflow == 2.X (verified working with 2.0 through 2.3, for both CPU and GPU)

To run the EEG/MEG ERP classification sample script, you will also need

- mne >= 0.17.1
- PyRiemann >= 0.2.5
- scikit-learn >= 0.20.1
- matplotlib >= 2.2.3
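Assuming a pip-based environment, the dependencies above can be installed with something along these lines (version pins simply mirror the list above; adjust them to your setup):

```shell
pip install "tensorflow>=2.0,<2.4" "mne>=0.17.1" "pyriemann>=0.2.5" "scikit-learn>=0.20.1" "matplotlib>=2.2.3"
```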
# Models Implemented
- EEGNet [[1]](http://stacks.iop.org/1741-2552/15/i=5/a=056013). Both the original and the revised model are implemented.
- EEGNet variant used for classification of Steady-State Visual Evoked Potential (SSVEP) signals [[2]](http://iopscience.iop.org/article/10.1088/1741-2552/aae5d8)
- DeepConvNet [[3]](https://onlinelibrary.wiley.com/doi/full/10.1002/hbm.23730)
- ShallowConvNet [[3]](https://onlinelibrary.wiley.com/doi/full/10.1002/hbm.23730)
# Usage
To use this package, add this folder to your PYTHONPATH environment variable. You can then import any model and configure it as follows:

```python
from EEGModels import EEGNet, ShallowConvNet, DeepConvNet

model  = EEGNet(nb_classes = ..., Chans = ..., Samples = ...)
model2 = ShallowConvNet(nb_classes = ..., Chans = ..., Samples = ...)
model3 = DeepConvNet(nb_classes = ..., Chans = ..., Samples = ...)
```
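If modifying the PYTHONPATH environment variable is inconvenient, the repository path can instead be appended at runtime before importing (the path below is a placeholder for wherever you cloned this repository):

```python
import sys

# Placeholder path: point this at your local clone of arl-eegmodels
repo_path = "/path/to/arl-eegmodels"

# Make the EEGModels module importable for this session only
if repo_path not in sys.path:
    sys.path.append(repo_path)
```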
Compile the model with the associated loss function and optimizer (in our case, categorical cross-entropy and the Adam optimizer, respectively), then fit the model and predict on new test data:

```python
model.compile(loss = 'categorical_crossentropy', optimizer = 'adam')
fittedModel = model.fit(...)
predicted   = model.predict(...)
```
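As a sketch of the data preparation implied by `model.fit(...)` above (the sizes here are hypothetical; the trials × channels × samples × 1 layout follows the repository's sample script):

```python
import numpy as np

# Hypothetical dimensions: 100 trials, 64 channels, 128 time samples, 2 classes
trials, chans, samples, nb_classes = 100, 64, 128, 2

# EEG data in (trials, Chans, Samples, 1) layout; the trailing singleton
# axis is the "kernels" (image-channel) dimension expected by the models
X = np.random.randn(trials, chans, samples, 1)

# Integer class labels, converted to one-hot encodings to match the
# categorical cross-entropy loss
labels = np.random.randint(0, nb_classes, size=trials)
Y = np.eye(nb_classes)[labels]

# These arrays would then be passed as model.fit(X, Y, ...)
```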
# EEGNet Feature Explainability
Note: please see https://github.com/vlawhern/arl-eegmodels/issues/29 for additional steps needed to get this to work with TensorFlow 2.

To reproduce the EEGNet single-trial feature relevance results reported in [[1]](http://stacks.iop.org/1741-2552/15/i=5/a=056013), download and install DeepExplain, located [here](https://github.com/marcoancona/DeepExplain), which implements a variety of relevance attribution methods (both gradient-based and perturbation-based). A sketch of how to use it is given below:
```python
from EEGModels import EEGNet
from tensorflow.keras.models import Model
from deepexplain.tensorflow import DeepExplain
from tensorflow.keras import backend as K

# configure, compile and fit the model
model = EEGNet(nb_classes = ..., Chans = ..., Samples = ...)
model.compile(loss = 'categorical_crossentropy', optimizer = 'adam')
fittedModel = model.fit(...)

# use DeepExplain to get individual trial feature relevances for some test data (X_test, Y_test).
# Note that model.layers[-2] points to the dense layer prior to the softmax activation. Also, we
# use the DeepLIFT method in the paper, although other options, including epsilon-LRP, are
# available. This works with all implemented models.

# Here, Y_test and X_test are the one-hot encodings of the class labels and
# the data, respectively.
with DeepExplain(session = K.get_session()) as de:
    input_tensor  = model.layers[0].input
    fModel        = Model(inputs = input_tensor, outputs = model.layers[-2].output)
    target_tensor = fModel(input_tensor)

    # can use epsilon-LRP as well if you like:
    attributions   = de.explain('deeplift', target_tensor * Y_test, input_tensor, X_test)
    # attributions = de.explain('elrp', target_tensor * Y_test, input_tensor, X_test)
```
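The `target_tensor * Y_test` product in the sketch above masks the dense-layer outputs so that relevance is attributed only to each trial's true class. In NumPy terms (with hypothetical values):

```python
import numpy as np

# Hypothetical pre-softmax outputs for 3 trials and 2 classes
outputs = np.array([[2.0, -1.0],
                    [0.5,  3.0],
                    [1.5,  0.5]])

# One-hot class labels (the role played by Y_test above)
Y_test = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 0.0]])

# The element-wise product zeroes out every output except the true-class
# entry, so only the true-class logit contributes to the attribution target
masked = outputs * Y_test
```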
# Paper Citation
If you use the EEGNet model in your research and find it helpful, please cite the following paper:
```
@article{Lawhern2018,
  author={Vernon J Lawhern and Amelia J Solon and Nicholas R Waytowich and Stephen M Gordon and Chou P Hung and Brent J Lance},
  title={EEGNet: a compact convolutional neural network for EEG-based brain–computer interfaces},
  journal={Journal of Neural Engineering},
  volume={15},
  number={5},
  pages={056013},
  url={http://stacks.iop.org/1741-2552/15/i=5/a=056013},
  year={2018}
}
```
If you use the SSVEP variant of the EEGNet model in your research and find it helpful, please cite the following paper:
```
@article{Waytowich2018,
  author={Nicholas Waytowich and Vernon J Lawhern and Javier O Garcia and Jennifer Cummings and Josef Faller and Paul Sajda and Jean M Vettel},
  title={Compact convolutional neural networks for classification of asynchronous steady-state visual evoked potentials},
  journal={Journal of Neural Engineering},
  volume={15},
  number={6},
  pages={066031},
  url={http://stacks.iop.org/1741-2552/15/i=6/a=066031},
  year={2018}
}
```
Similarly, if you use the ShallowConvNet or DeepConvNet models and find them helpful, please cite the following paper:
```
@article{hbm23730,
  author = {Schirrmeister Robin Tibor and
            Springenberg Jost Tobias and
            Fiederer Lukas Dominique Josef and
            Glasstetter Martin and
            Eggensperger Katharina and
            Tangermann Michael and
            Hutter Frank and
            Burgard Wolfram and
            Ball Tonio},
  title = {Deep learning with convolutional neural networks for EEG decoding and visualization},
  journal = {Human Brain Mapping},
  volume = {38},
  number = {11},
  pages = {5391-5420},
  keywords = {electroencephalography, EEG analysis, machine learning, end-to-end learning, brain–machine interface, brain–computer interface, model interpretability, brain mapping},
  doi = {10.1002/hbm.23730},
  url = {https://onlinelibrary.wiley.com/doi/abs/10.1002/hbm.23730}
}
```
# Legal Disclaimer
This project is governed by the terms of the Creative Commons Zero 1.0 Universal (CC0 1.0) Public Domain Dedication (the Agreement). You should have received a copy of the Agreement with a copy of this software. If not, see https://github.com/USArmyResearchLab/ARLDCCSO. Your use or distribution of ARL EEGModels, in both source and binary form, in whole or in part, implies your agreement to abide by the terms set forth in the Agreement in full.

Other portions of this project are subject to domestic copyright protection under 17 USC Sec. 105. Those portions are licensed under the Apache 2.0 license. The complete text of the license governing this material is in the file labeled LICENSE.TXT that is a part of this project's official distribution.

arl-eegmodels is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

You may find the full license in the file LICENSE in this directory.
# Contributions
Due to legal issues, every contributor must have a signed Contributor License Agreement on file. The ARL Contributor License Agreement (ARL Form 266) can be found [here](https://github.com/USArmyResearchLab/ARL-Open-Source-Guidance-and-Instructions/blob/master/ARL%20Form%20-%20266.pdf).

Each external contributor must execute and return a copy for each project that he or she intends to contribute to. Once ARL receives the executed form, it remains in force permanently, so external contributors need only execute the form once per project.