This repository contains the source code for our paper "Attention-Guided Version of 2D UNet for Automatic Brain Tumor Segmentation".

Our paper can be found at [this link](https://ieeexplore.ieee.org/document/8964956).
## Overview
- [Dataset](#Dataset)
- [Pre-processing](#Pre-processing)
- [Architecture](#Architecture)
- [Training Process](#Training-Process)
- [Results](#Results)
- [Usage](#Usage)

### Dataset
The [BraTS](http://www.med.upenn.edu/sbia/brats2018.html) dataset is used for training and evaluating the model. It provides four MRI modalities for each brain, namely T1, T1c (post-contrast T1), T2, and FLAIR, all skull-stripped, resampled, and co-registered. For more information, please refer to the main site.

### Pre-processing
For pre-processing, the [N4ITK](https://ieeexplore.ieee.org/abstract/document/5445030) algorithm is first applied to each MRI modality to correct the intensity inhomogeneity of the images. Next, the top and bottom 1% of intensities are clipped, and each modality is normalized to zero mean and unit variance.
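
The clipping and normalization steps take only a few lines of NumPy; the sketch below is illustrative (the function name and the assumption that a modality is already loaded as a NumPy array are ours, not the repo's API):

```
import numpy as np

def clip_and_normalize(volume):
    # Clip the top and bottom 1% of intensities.
    low, high = np.percentile(volume, [1, 99])
    volume = np.clip(volume, low, high)
    # Shift and scale to zero mean and unit variance.
    return (volume - volume.mean()) / (volume.std() + 1e-8)
```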

### Architecture
<br />

![image](https://github.com/Mehrdad-Noori/Brain-Tumor-Segmentation/blob/master/doc/model.jpg)

<br />

The network is based on the U-Net architecture with the following modifications:
- Minor modifications: residual units, strided convolutions, PReLU activations, and batch normalization layers are added to the original U-Net.
- The attention mechanism: a [Squeeze and Excitation Block](https://arxiv.org/abs/1709.01507) (SE) is employed on the concatenated multi-level features. This technique adaptively weights each channel, helping the model focus on informative features (please refer to [our paper](https://ieeexplore.ieee.org/document/8964956) for more information); a minimal sketch of the block follows the figure below.

<br />

<p align="left"><img src="https://github.com/Mehrdad-Noori/Brain-Tumor-Segmentation/blob/master/doc/attention.jpg" width="500" height="220"></p>

<br />
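
The attention module in the figure above can be sketched in a few lines of Keras. This is a minimal illustration of a squeeze-and-excitation block; choices such as the reduction ratio of 16 follow the SE paper rather than this repository's exact configuration:

```
from keras.layers import Input, GlobalAveragePooling2D, Dense, Reshape, Multiply
from keras.models import Model

def se_block(x, channels, ratio=16):
    # Squeeze: global average pooling collapses each channel to a single value.
    s = GlobalAveragePooling2D()(x)
    # Excitation: a bottleneck MLP yields one weight in (0, 1) per channel.
    s = Dense(channels // ratio, activation='relu')(s)
    s = Dense(channels, activation='sigmoid')(s)
    s = Reshape((1, 1, channels))(s)
    # Re-weight the input feature maps channel by channel.
    return Multiply()([x, s])

# Example: re-weight a 64-channel feature map.
inp = Input(shape=(128, 128, 64))
model = Model(inp, se_block(inp, channels=64))
```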
### Training Process
Since the proposed network is a 2D architecture, 2D slices must be extracted from the 3D MRI volumes. To benefit from the 3D contextual information of the input images, we extract 2D slices from both the axial and coronal views and train a separate network for each view. At test time, we build the 3D output volume for each model by stacking the 2D predicted maps, and finally fuse the two views by pixel-wise averaging.
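
As a small illustration of the fusion step, assuming each view's 2D probability maps have already been stacked back into aligned 4D volumes (the names and shapes are our assumptions, not the repo's actual interface):

```
import numpy as np

def fuse_views(axial_vol, coronal_vol):
    # Both inputs: (H, W, D, C) class-probability volumes on the same grid.
    fused = (axial_vol + coronal_vol) / 2.0  # pixel-wise averaging of the two views
    return np.argmax(fused, axis=-1)         # final per-voxel label map

# Example with random probabilities for a tiny 4-class volume.
labels = fuse_views(np.random.rand(8, 8, 8, 4), np.random.rand(8, 8, 8, 4))
```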

<br />

<p align="left"><img src="https://github.com/Mehrdad-Noori/Brain-Tumor-Segmentation/blob/master/doc/MultiView.jpg" width="600" height="220"></p>

<br />

### Results
The results are obtained from the [BraTS online evaluation platform](https://ipp.cbica.upenn.edu/) using the BraTS 2018 validation set.

<br />

<p align="center"><img src="https://github.com/Mehrdad-Noori/Brain-Tumor-Segmentation/blob/master/doc/table.jpg" width="500" height="130"></p>

<br />

![image](https://github.com/Mehrdad-Noori/Brain-Tumor-Segmentation/blob/master/doc/example.jpg)

<br />

### Dependencies
- [numpy 1.17.4](https://numpy.org/)
- [nibabel 3.0.1](https://nipy.org/nibabel/)
- [scipy 1.3.2](https://www.scipy.org/)
- [tables 3.6.1](https://www.pytables.org/)
- [TensorFlow 1.15.2](https://www.tensorflow.org/)
- [Keras 2.2.4](https://keras.io/)

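These can be installed with pip, for example (the pinned versions match the list above; a Python environment compatible with TensorFlow 1.x is assumed):

```
pip install numpy==1.17.4 nibabel==3.0.1 scipy==1.3.2 tables==3.6.1 tensorflow==1.15.2 keras==2.2.4
```
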
### Usage
1- Download the BraTS 2019, 2018, or 2017 data by following the steps described on the [BraTS](https://www.med.upenn.edu/cbica/brats2019/registration.html) registration page.

2- Perform N4ITK bias correction using [ANTs](https://github.com/ANTsX/ANTs); follow the steps in [this repo](https://github.com/ellisdg/3DUnetCNN) (this step is optional).
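
If you would rather stay in Python, SimpleITK also ships an N4 implementation that can stand in for the ANTs tool. A minimal sketch (SimpleITK is not one of this repo's dependencies, and the file names are placeholders):

```
import SimpleITK as sitk

# Read one modality and cast to float, as N4 requires a real-valued image.
img = sitk.Cast(sitk.ReadImage('t1.nii.gz'), sitk.sitkFloat32)
# Rough foreground mask so the bias field is estimated inside the head only.
mask = sitk.OtsuThreshold(img, 0, 1, 200)
corrected = sitk.N4BiasFieldCorrection(img, mask)
sitk.WriteImage(corrected, 't1_n4.nii.gz')
```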
3- Set the path to all brain volumes in `config.py` (e.g. `cfg['data_dir'] = './BRATS19/MICCAI_BraTS_2019_Data_Training/*/*/'`).
4- To read, preprocess and save all brain volumes into a single table file:
```
python prepare_data.py
```
5- To run the training:
```
python train.py
```
The model can be trained from the `axial`, `saggital`, or `coronal` view (set `cfg['view']` in `config.py`). Moreover, K-fold cross-validation can be used (set `cfg['k_fold']` in `config.py`).
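
For example, the relevant `config.py` settings might look like this; the keys are the ones mentioned above, while the concrete values are assumptions to be checked against the repo's defaults:

```
# Illustrative edits to config.py; in the repo, cfg is defined there.
cfg = {}
cfg['data_dir'] = './BRATS19/MICCAI_BraTS_2019_Data_Training/*/*/'
cfg['view'] = 'axial'  # or 'saggital' / 'coronal'
cfg['k_fold'] = 5      # number of folds for K-fold cross-validation
```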
6- To predict and save label maps:
```
python predict.py
```
The predictions will be written in `.nii.gz` format and can be uploaded to the [BraTS online evaluation platform](https://ipp.cbica.upenn.edu/).
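
For reference, a label map in `.nii.gz` format can be written with nibabel roughly as follows (the reference file name and the zero-filled prediction are placeholders):

```
import nibabel as nib
import numpy as np

# Borrow the affine from a reference scan so the labels align with the input.
ref = nib.load('BraTS19_example_flair.nii.gz')
labels = np.zeros(ref.shape, dtype=np.uint8)  # stand-in for a real prediction
nib.save(nib.Nifti1Image(labels, ref.affine), 'prediction.nii.gz')
```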
### Citation
```
@inproceedings{noori2019attention,
  title={Attention-Guided Version of 2D UNet for Automatic Brain Tumor Segmentation},
  author={Noori, Mehrdad and Bahri, Ali and Mohammadi, Karim},
  booktitle={2019 9th International Conference on Computer and Knowledge Engineering (ICCKE)},
  pages={269--275},
  year={2019},
  organization={IEEE}
}
```