Code for adapting a subject-independent deep convolutional neural network (CNN)-based electroencephalography (EEG) BCI system for decoding hand motor imagery (MI). Five schemes are presented, each of which fine-tunes an extensively trained subject-independent model to enhance evaluation performance on a target subject.

## Citation
```
@article{ZHANG20211,
  title = "Adaptive transfer learning for EEG motor imagery classification with deep Convolutional Neural Network",
  journal = "Neural Networks",
  volume = "136",
  pages = "1 - 10",
  year = "2021",
  issn = "0893-6080",
  doi = "https://doi.org/10.1016/j.neunet.2020.12.013",
  url = "http://www.sciencedirect.com/science/article/pii/S0893608020304305",
  author = "Kaishuo Zhang and Neethu Robinson and Seong-Whan Lee and Cuntai Guan",
}
```

## Summary of Results

| Methodology | Mean (SD) | Median | Range (Max-Min) |
|-|-|-|-|
| Subject-Specific | 63.54 (14.25) | 60.50 | 57.00 (100.00-43.00) |
| Subject-Independent | 84.19 (9.98) | 84.50 | 47.50 (99.50-52.00) |
| Subject-Adaptive<br>(Scheme 4, 80%) | 86.89 (11.41) | 88.50 | 44.00 (100.00-56.00) |

Detailed subject-level results can be found in [result_table.pdf](result_table.pdf).

## Resources
- Raw Dataset: [Link](http://gigadb.org/dataset/100542)
- Sample pre-trained subject-independent model: [Link](https://github.com/zhangks98/eeg-adapt/tree/master/pretrained_models)

## Instructions
### Install the dependencies
It is recommended to create a virtual environment with Python 3.7 and activate it before running the following:
```sh
pip install -r requirements.txt
```

### Obtain the raw dataset
Download the raw dataset from the [resources](#resources) above, and save all files to the same folder. To conserve space, you may download only the files that end with `EEG_MI.mat`.

### Pre-process the raw dataset
The following command reads the raw dataset from the `$source` folder and writes the pre-processed data `KU_mi_smt.h5` to the `$target` folder.
```sh
python preprocess_h5_smt.py $source $target
```

### Training and evaluation
#### Subject-specific classification
```
usage: train_within.py [-h] [-gpu GPU] [-start START] [-end END] [-subj SUBJ [SUBJ ...]] datapath outpath

Subject-specific classification with KU Data

positional arguments:
  datapath              Path to the h5 data file
  outpath               Path to the result folder

optional arguments:
  -h, --help            show this help message and exit
  -gpu GPU              The gpu device index to use
  -start START          Start of the subject index
  -end END              End of the subject index (not inclusive)
  -subj SUBJ [SUBJ ...]
                        Explicitly set the subject number. This will override the start and end argument
```
To train the subject-specific model for all subjects, run
```sh
python train_within.py $datapath $outpath
```

#### Subject-independent classification
```
usage: train_base.py [-h] -fold FOLD [-gpu GPU] datapath outpath

Subject independent classification with KU Data

positional arguments:
  datapath    Path to the h5 data file
  outpath     Path to the result folder

optional arguments:
  -h, --help  show this help message and exit
  -fold FOLD  k-fold index, starts with 0
  -gpu GPU    The gpu device to use
```
The `$fold` index has a one-to-one mapping to the subject index, as the subjects were shuffled in a pre-defined order (using random seed 20200205). The mapping is listed in the [subj-to-fold.csv](subj-to-fold.csv) file.
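Such a seeded shuffle can be sketched as below; the seed comes from the text, but the exact shuffling routine the authors used is an assumption, so treat [subj-to-fold.csv](subj-to-fold.csv) as the authoritative mapping.

```python
import random

# Hypothetical reconstruction of the subject-to-fold shuffle. Seed 20200205
# is taken from this README; the shuffling routine itself is an assumption.
SEED = 20200205
subjects = list(range(1, 55))  # the KU dataset has 54 subjects

rng = random.Random(SEED)
rng.shuffle(subjects)

# Fold k (0-based) holds out the k-th subject in the shuffled order.
fold_to_subject = dict(enumerate(subjects))
```

A fixed seed makes the fold assignment reproducible across runs, which is why a single CSV can record the mapping once and for all.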

To train the subject-independent model for all subjects, run
```sh
python train_base.py $datapath $outpath -fold $fold
```
for each `$fold` from 0 to 53 (inclusive).
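The loop over folds can be scripted; the sketch below only builds the 54 command lines (paths are placeholders) and leaves dispatching them, e.g. via `subprocess.run(cmd, check=True)`, to the reader.

```python
# Placeholder paths; substitute your own preprocessed file and result folder.
datapath = "KU_mi_smt.h5"
outpath = "results/base"

# One train_base.py invocation per fold, covering folds 0..53 inclusive.
commands = [
    ["python", "train_base.py", datapath, outpath, "-fold", str(fold)]
    for fold in range(54)
]
```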

This process is likely to take some time. We have provided some sample pre-trained models in the [resources](#resources) above.

#### Subject-adaptive classification
```
usage: train_adapt.py [-h] [-scheme SCHEME] [-trfrate TRFRATE] [-lr LR] [-gpu GPU] datapath modelpath outpath

Subject adaptative classification with KU Data

positional arguments:
  datapath          Path to the h5 data file
  modelpath         Path to the base model folder
  outpath           Path to the result folder

optional arguments:
  -h, --help        show this help message and exit
  -scheme SCHEME    Adaptation scheme
  -trfrate TRFRATE  The percentage of data for adaptation
  -lr LR            Learning rate
  -gpu GPU          The gpu device to use
```
As an example, to train the subject-adaptive model for all subjects using the default configuration (scheme 4, adaptation rate 100%, learning rate 0.0005), run:
```sh
python train_adapt.py $datapath $modelpath $outpath
```
The `$modelpath` corresponds to the result folder of the [subject-independent classification](#subject-independent-classification), or the path to the pre-trained model.
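To reproduce non-default rows such as "Scheme 4, 80%" from the results table, the scheme and adaptation rate must be passed explicitly. The sketch below builds a hypothetical sweep: the flag names follow the usage text above, but the particular grid of schemes and rates (and the output folder naming) is an assumption for illustration.

```python
from itertools import product

datapath = "KU_mi_smt.h5"       # placeholder paths
modelpath = "results/base"
outpath_root = "results/adapt"

schemes = [1, 2, 3, 4, 5]       # the five adaptation schemes
trfrates = [80, 100]            # adaptation rates (%) mentioned in this README

# One train_adapt.py invocation per (scheme, rate) combination.
commands = [
    ["python", "train_adapt.py", datapath, modelpath,
     f"{outpath_root}/scheme{s}_rate{r}",
     "-scheme", str(s), "-trfrate", str(r)]
    for s, r in product(schemes, trfrates)
]
```

Writing each combination to its own output folder keeps the per-configuration results separate for later comparison.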

### Training on multiple GPUs
To speed up training, you can make use of multiple GPUs. The `tasks_*` folders contain sample scripts for training on 8 GPUs. To generate these scripts, run:
```sh
python generate.py
```
```