# ECG Arrhythmia classification

The repository contains code for a Master's degree dissertation -
**Diagnosis of Diseases by ECG Using Convolutional Neural Networks**.
Only CNN models are considered in the paper and the repository.
As a part of the work, more than 30 experiments have been run.
The table with all experiments and their metrics is available at the [link](https://docs.google.com/spreadsheets/d/1fmLNC1_1xohEUEoNRodkIoFHnJI3O4HfCHN8Ys_b7nE).

The best **1D** and **2D** CNN models are presented in the repository.
The repository follows the _config_ principle and can be run in the following modes:

- Training - use `train.py --config configs/training/<config>.json` to train a model
- Validation - use `inference.py --config configs/inference/config.json` to validate a model
- Pipeline - use `pipeline.py --config configs/pipelines/config.json` to test a model on ECG data (i.e. data generation, inference, and visualization of the results)
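
All three entry points share the same convention: a JSON config selected on the command line. A minimal, self-contained sketch of that pattern (the `ecg_data` key is taken from the pipeline config shown later; everything else about the repository's actual argument handling is assumed):

```python
import argparse
import json
import tempfile

def load_config(argv=None):
    """Parse `--config <path>.json` and return the parsed JSON dict -
    the convention shared by train.py, inference.py, and pipeline.py."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--config", required=True, help="path to a JSON config file")
    args = parser.parse_args(argv)
    with open(args.config) as f:
        return json.load(f)

# Self-contained demo: write a tiny config file and load it back.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"ecg_data": "./mit-bih/100"}, f)

config = load_config(["--config", f.name])
```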

All available models and all necessary information are described below.

**Python 3.7** and **PyTorch** are used in the project.
**GitHub Actions** is used for installing dependencies and training the implemented models.

Program - **Data Mining**
Department - **Computer Science**

Principal Investigator - **Nikolai Yu. Zolotykh**
_National Research University - Higher School of Economics_

## Implemented models

#### 1D models:

- [Cardiologist-Level Arrhythmia Detection with Convolutional Neural Networks](https://arxiv.org/abs/1707.01836)
- [ECG Heartbeat Classification Using Convolutional Neural Networks](https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8952723)
- [Electrocardiogram Generation and Feature Extraction Using a Variational Autoencoder](https://arxiv.org/pdf/2002.00254.pdf) (encoder only)
- **Author's EcgResNet34**

#### 2D models:

- [ECG arrhythmia classification using a 2-D convolutional neural network](https://arxiv.org/abs/1804.06812)
- MobileNetV2
- EfficientNetB4

#### Metrics

| **name** | **type** | **model** | **accuracy** | **val loss** |
| -------- | ----------------------------------------------- | ------------------------------------------------------------ | ------------ | ------------ |
| exp-025 | 1D (1x128) - [PEAK[t] - 64, PEAK[t] + 64] | https://arxiv.org/pdf/1707.01836.pdf | 0.9827 | 0.0726 |
| exp-030 | 1D (1x128) - [PEAK[t] - 64, PEAK[t] + 64] | https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8952723 | 0.9864 | 1.5 |
| exp-031 | 1D (1x128) - [PEAK[t] - 64, PEAK[t] + 64] | https://arxiv.org/pdf/2002.00254.pdf | 0.9886 | 0.15 |
| exp-018 | 2D (128x128) - [PEAK[t] - 64, PEAK[t] + 64] | https://arxiv.org/pdf/1804.06812.pdf | 0.9920 | 0.1 |
| exp-013 | 2D (128x128) - [PEAK[t] - 64, PEAK[t] + 64] | MobileNetV2 | 0.9934 | 0.088 |
| exp-021 | 2D (128x128) - [PEAK[t-1] + 20, PEAK[t+1] - 20] | EfficientNetB4 | 0.9935 | 0.062 |
| exp-029 | 1D (1x128) - [PEAK[t] - 64, PEAK[t] + 64] | Author's EcgResNet34 | **0.9938** | **0.0500** |
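
The window notation in the **type** column means each training sample is a fixed-length slice of the signal centred on an annotated R-peak, e.g. `[PEAK[t] - 64, PEAK[t] + 64]` yields a 1x128 input. A plain-Python illustration of that slicing (the boundary handling is an assumption, not the repository's exact code):

```python
def beat_window(signal, peak, half_width=64):
    """Slice [peak - half_width, peak + half_width) around an R-peak index.

    Returns None when the window would run off either end of the record,
    i.e. boundary beats are skipped.
    """
    start, stop = peak - half_width, peak + half_width
    if start < 0 or stop > len(signal):
        return None
    return signal[start:stop]

# Toy record: 1000 samples with an R-peak annotated at index 500.
signal = [0.0] * 1000
window = beat_window(signal, 500)   # 128 samples - the 1x128 input in the table
edge = beat_window(signal, 10)      # too close to the start - skipped
```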

## Getting started

Training quick start:

1. [Download](https://physionet.org/static/published-projects/mitdb/mit-bih-arrhythmia-database-1.0.0.zip) the MIT-BIH Arrhythmia Database and unzip the files into the `mit-bih` directory
2. Install the requirements via `pip install -r requirements.txt`
3. Generate the 1D and 2D data files by running `cd scripts && python dataset-generation-pool.py`
4. Create the `json` annotation files
   - For the 1D model - `cd scripts && python annotation-generation-1d.py`
   - For the 2D model - `cd scripts && python annotation-generation-2d.py`
5. Run training - `python train.py --config configs/training/<config>.json`

See the [CI examples](https://github.com/lxdv/ecg-classification/actions) for each model.

## Testing and visualization

_Using the EcgResNet34 model, as it shows the best metrics_

1. Install the requirements via `pip install -r requirements.txt`
2. Create a directory named `experiments`
3. [Download](https://drive.google.com/file/d/1wCy9Y4EQmI3gdVTX77U7ZXa5zPaqLQ5S/view?usp=sharing) the archive and unzip its content into the `experiments` directory
4. Download data in [WFDB format](https://www.physionet.org/physiotools/wpg/wpg_35.htm)
5. Change the `ecg_data` path in `configs/pipelines/config.json` - specify the record path **with no extension**:

```
{
    ...
    "ecg_data": "./mit-bih/100",
    ...
}
```
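
Editing that key can also be scripted. A stdlib-only sketch that rewrites `ecg_data` and leaves every other key untouched (only the `ecg_data` key and the config path are taken from the steps above; the rest of the config schema is unknown):

```python
import json

def set_ecg_data(config_path, record_path):
    """Point "ecg_data" at a WFDB record path given without extension,
    e.g. "./mit-bih/100" rather than "./mit-bih/100.dat"."""
    with open(config_path) as f:
        config = json.load(f)
    config["ecg_data"] = record_path
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)

# Example (assumes the repository layout from the steps above):
# set_ecg_data("configs/pipelines/config.json", "./mit-bih/100")
```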

6. Run the pipeline - `python pipeline.py --config configs/pipelines/config.json`

The results will be saved as an HTML file in the `experiments/EcgResNet34/results` directory.

![results](https://github.com/lxdv/ecg-classification/blob/master/data/examples/results.gif)

## Experiments

The code for all experiments described in the [table](https://docs.google.com/spreadsheets/d/1fmLNC1_1xohEUEoNRodkIoFHnJI3O4HfCHN8Ys_b7nE)
is in the **experiments/exp-XXX** branches.

## Other

The repository contains Jupyter Notebooks (see the `notebooks` folder).

![jupyter](https://github.com/lxdv/ecg-classification/blob/master/data/examples/other.gif)

## Contributors

- [Alexander Lyashuk](mailto:lyashuk.me@gmail.com)
- [Nikolai Zolotykh](mailto:nikolai.zolotykh@gmail.com)

## Support

Please give a ⭐️ if this project helped you.

## License

This project is licensed under the [MIT](LICENCE) License.