# MRI Brain Tumor Segmentation and Uncertainty Estimation using 3D-Unet architectures on BraTS'20

This repository contains the code for the work presented in the paper
[MRI Brain Tumor Segmentation and Uncertainty Estimation using 3D-Unet architectures](https://arxiv.org/abs/2012.15294),
which was used to participate in the BraTS'20 challenge on Brain Tumor Segmentation, tasks 1 and 3.

This work proposes V-Net and 3D-UNet based models for semantic segmentation of brain tumors in 3D MRI and identifies certain and uncertain predictions at test time.

The original repository can be found [here](https://github.com/LauraMoraB/BrainTumorSegmentation).

## Repository Structure

    |__ resources/
        |__ config.ini
    |__ src/
        |__ dataset/
        |__ losses/
        |__ ensemble/
        |__ metrics/
        |__ post_processing/
        |__ test/
        |__ train/
        |__ uncertainty/
        |__ config.py
        |__ logging_conf.py
        |__ train.py
        |__ inference.py
        |__ normalize_uncertainty.py
        |__ run_post_processing.py
    |__ tests/
    |__ README.md

## Dataset structure

The dataset used in this repository is the official one provided by BraTS'20 for training, validation and test.

For each patient, a folder with the following files is provided (`*_seg.nii.gz` is only provided for the training set):

```
BraTS20_Training_001/
    BraTS20_Training_001_flair.nii.gz
    BraTS20_Training_001_seg.nii.gz
    BraTS20_Training_001_t1.nii.gz
    BraTS20_Training_001_t1ce.nii.gz
    BraTS20_Training_001_t2.nii.gz
```

In this project, the data is expected to be separated into the three different sets and organized by sampling technique
(in case the sampling is computed beforehand). For example:
```
* Train: ~/train/source_sampling/BraTS20_Training_00*/*.nii.gz
* Validation: ~/validation/source_sampling/BraTS20_Training_00*/*.nii.gz
* Test: ~/test/source_sampling/BraTS20_Training_00*/*.nii.gz
```

It also requires the `brats20_data.csv` file, which has the following information:

| ID | Grade | subject_ID | Center | Patch | Size | Train |
| -- | ----- | ---------- | ------ | ----- | ---- | ----- |
| 1 | HGG | BraTS20_Training_001 | CBICA | BraTS20_Training_001 | 240x240x155 | train |
| 2 | LGG | BraTS20_Training_270 | TMC | BraTS20_Training_270 | 240x240x155 | test |

The *Train* column is used to select some samples for testing.

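For reference, the snippet below is a minimal sketch of how this layout could be consumed: it reads `brats20_data.csv` with pandas, selects the held-out subjects via the *Train* column, and stacks the four modalities of one subject with nibabel. It is not the repository's own loader (that code lives in `src/dataset/`), and the paths are placeholders.

```python
# Minimal sketch, not the repository's own loader (see src/dataset/ for that):
# read brats20_data.csv, select the held-out rows via the Train column and load
# the four MRI modalities of one subject into a single 4-channel volume.
import numpy as np
import pandas as pd
import nibabel as nib

DATA_ROOT = "data/train/source_sampling"  # placeholder, adjust to your layout
CSV_PATH = "brats20_data.csv"             # placeholder, path to the metadata file

df = pd.read_csv(CSV_PATH)
test_subjects = df[df["Train"] == "test"]["subject_ID"].tolist()

def load_subject(subject_id: str, root: str = DATA_ROOT) -> np.ndarray:
    """Stack flair/t1/t1ce/t2 into a (4, 240, 240, 155) array."""
    modalities = ["flair", "t1", "t1ce", "t2"]
    volumes = [
        nib.load(f"{root}/{subject_id}/{subject_id}_{m}.nii.gz").get_fdata()
        for m in modalities
    ]
    return np.stack(volumes, axis=0)
```
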
## Installation

```
pip install -r requirements.txt
```

## Execution

Several processes can be run: model training, inference, uncertainty inference, post-processing of the obtained results, metric computation, and ensemble computation.

All scripts run in the same way, since all the required configuration is read from the `config.ini` file:
```
python <script.py> resources/config.ini
```

However, `run_post_processing.py` is meant to be run with SLURM arrays, so it will need editing in case you don't have a SLURM environment.

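As a rough illustration of this config-driven setup, the sketch below reads `resources/config.ini` with the standard `configparser` module. The section and option names (`[basics]`, `train_flag`, `test_flag`) are the ones shown later in this README; the repository's own `src/config.py` may parse the file differently.

```python
# Minimal sketch of the config-driven entry point; src/config.py may differ.
import sys
from configparser import ConfigParser

def load_config(path: str) -> ConfigParser:
    parser = ConfigParser()
    with open(path) as handle:  # fail loudly if the file does not exist
        parser.read_file(handle)
    return parser

if __name__ == "__main__":
    config = load_config(sys.argv[1])  # e.g. resources/config.ini
    print("train:", config.getboolean("basics", "train_flag"))
    print("test:", config.getboolean("basics", "test_flag"))
```
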
### Training

```
python train.py resources/config.ini
```

#### Network

Four possible networks:
* Basic V-Net: `vnet`
* Deeper V-Net: `vnet_asymm`
* Basic 3D-UNet: `3dunet`
* Residual 3D-UNet: `3dunet_residual`

```ini
n_epochs: 100

init_features_maps: 32
# one of: 3dunet_residual, 3dunet, vnet_asymm, vnet
network: 3dunet_residual

# unet based
unet_order: crg
# alternative: cli (conv + LeakyReLU + instancenorm)

# vnet_asymm
non_linearity: relu
kernel_size: 3
padding: 1

# vnet
use_elu: true
```

#### Optimizer

Two optimizers are implemented: ADAM and SGD.
```ini
optimizer: ADAM
learning_rate: 1e-4
weight_decay: 1e-5
# sgd only
momentum: 0.99
```

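As an illustration of how these options could map to optimizer objects, here is a sketch assuming the models are PyTorch modules; the actual wiring in `train.py` may differ.

```python
# Sketch: building an optimizer from the config values above (assumes PyTorch).
import torch

def build_optimizer(model: torch.nn.Module, name: str = "ADAM",
                    lr: float = 1e-4, weight_decay: float = 1e-5,
                    momentum: float = 0.99) -> torch.optim.Optimizer:
    if name.upper() == "ADAM":
        return torch.optim.Adam(model.parameters(), lr=lr,
                                weight_decay=weight_decay)
    if name.upper() == "SGD":
        return torch.optim.SGD(model.parameters(), lr=lr,
                               momentum=momentum, weight_decay=weight_decay)
    raise ValueError(f"Unsupported optimizer: {name}")
```
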
#### Loss

* The loss can be evaluated on the ET/TC/WT regions (`eval_regions: true`) or on the ED/NCR/ET labels (`eval_regions: false`)
* Available losses: `dice`, `both_dice` (dice on eval regions + standard dice), `gdl` (not implemented with eval regions), `combined` (cross-entropy + dice); see the soft Dice sketch after the config example below

```ini
loss: gdl
eval_regions: false
```

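For reference, the soft Dice term that these losses build on can be written as below. This is a sketch assuming one-hot PyTorch targets, not the repository's implementation (see `src/losses/` for the `dice`, `both_dice`, `gdl` and `combined` variants).

```python
# Sketch of a per-channel soft Dice loss (assumes PyTorch, one-hot targets).
# The repository's own losses in src/losses/ also cover the eval_regions,
# gdl and combined (cross-entropy + dice) options described above.
import torch

def soft_dice_loss(probs: torch.Tensor, target: torch.Tensor,
                   eps: float = 1e-6) -> torch.Tensor:
    """probs, target: (batch, channels, D, H, W), target one-hot encoded."""
    dims = (2, 3, 4)                              # sum over the spatial dims
    intersection = (probs * target).sum(dim=dims)
    denominator = probs.sum(dim=dims) + target.sum(dim=dims)
    dice = (2.0 * intersection + eps) / (denominator + eps)
    return 1.0 - dice.mean()                      # mean over batch and channels
```
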
### Inference

Run as:
```
python inference.py resources/config.ini
```

#### Segmentation

```ini
[basics]
train_flag: false
compute_patches: false
resume: false

test_flag: true
uncertainty_flag: false
```

#### Uncertainty

Three types of uncertainty can be computed (a TTD sketch follows the config example below):
* aleatoric: `uncertainty_type: tta` and `use_dropout: false`
* epistemic: `uncertainty_type: ttd`
* both: `uncertainty_type: tta` and `use_dropout: true`

```ini
[basics]
train_flag: false
compute_patches: false
resume: false

test_flag: true
uncertainty_flag: true

[uncertainty]
n_iterations: 20
uncertainty_type: tta
# use_dropout is only considered when uncertainty_type=tta
use_dropout: false
```

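A minimal sketch of the test-time dropout (TTD) idea, assuming a PyTorch model with dropout layers: dropout is kept active at inference, `n_iterations` stochastic passes are run, and the voxelwise variance and entropy are derived from them. The repository's code in `src/uncertainty/` also implements TTA (test-time augmentation) and the combination of both.

```python
# Sketch of epistemic uncertainty via test-time dropout (TTD), assuming PyTorch.
# The repository's src/uncertainty/ code also handles TTA and TTA + TTD.
import torch

@torch.no_grad()
def ttd_uncertainty(model: torch.nn.Module, volume: torch.Tensor,
                    n_iterations: int = 20):
    model.eval()
    for module in model.modules():
        # keep only the dropout layers in training mode so sampling is stochastic
        if isinstance(module, (torch.nn.Dropout, torch.nn.Dropout3d)):
            module.train()

    # (n_iterations, batch, classes, D, H, W) stack of softmax outputs
    probs = torch.stack([torch.softmax(model(volume), dim=1)
                         for _ in range(n_iterations)])

    mean_probs = probs.mean(dim=0)            # averaged segmentation probabilities
    variance = probs.var(dim=0).sum(dim=1)    # voxelwise variance, summed over classes
    entropy = -(mean_probs * torch.log(mean_probs + 1e-12)).sum(dim=1)
    return mean_probs, variance, entropy
```
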
## Model results

### Task 1: Segmentation

| METHOD | DICE WT | DICE TC | DICE ET | HAUSDORFF WT | HAUSDORFF TC | HAUSDORFF ET |
| ------ | ------- | ------- | ------- | ------------ | ------------ | ------------ |
| Basic V-Net | 0.8360 | 0.7499 | 0.6159 | 26.4085 | 13.3398 | 49.7425 |
| Basic V-Net + post | 0.8463 | 0.7526 | 0.6179 | 20.4073 | 12.1752 | 47.7020 |
| Deeper V-Net | 0.8571 | 0.7755 | 0.6866 | 16.0270 | 17.6447 | 44.0950 |
| Deeper V-Net + post | 0.8611 | 0.7790 | 0.6897 | 14.4988 | 16.1533 | 43.5184 |
| Basic 3D-UNet | 0.8411 | 0.7906 | 0.6876 | 13.3658 | 13.6065 | 50.9828 |
| Basic 3D-UNet + post | 0.8052 | 0.7749 | 0.6742 | 13.0969 | 14.0047 | 43.8928 |
| Residual 3D-UNet | 0.8072 | 0.7740 | 0.6955 | 16.9635 | 17.5142 | 39.9172 |
| Residual 3D-UNet + post | 0.8142 | 0.7748 | 0.7119 | 11.8505 | 18.8146 | 34.9652 |
| Residual 3D-UNet-multiscale | 0.8172 | 0.7664 | 0.7071 | 15.5342 | 13.9380 | 38.6098 |
| Residual 3D-UNet-multiscale + post | 0.8246 | 0.7647 | 0.7163 | 12.3372 | 13.1045 | 37.4224 |
| Ensemble mean | 0.8317 | 0.7874 | 0.6951 | 13.4655 | 12.9562 | 47.5703 |
| Ensemble mean + post | 0.8367 | 0.7885 | 0.7194 | 10.9320 | 12.2427 | 37.9678 |
| Ensemble majority | 0.8223 | 0.7801 | 0.7003 | 10.9781 | 12.6571 | 41.8566 |
| Ensemble majority + post | 0.8242 | 0.7801 | 0.7003 | 10.0768 | 14.6322 | 46.6045 |

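For reference, the two ensembling strategies reported above can be sketched as follows (probability averaging vs. majority voting on the per-model label maps); the repository's own implementation lives in `src/ensemble/` and its details may differ.

```python
# Sketch of the two ensembling strategies in the table above (NumPy).
# The repository's own implementation is in src/ensemble/.
import numpy as np

def ensemble_mean(prob_maps: list) -> np.ndarray:
    """prob_maps: list of (classes, D, H, W) softmax outputs -> (D, H, W) labels."""
    return np.mean(prob_maps, axis=0).argmax(axis=0)

def ensemble_majority(label_maps: list) -> np.ndarray:
    """label_maps: list of (D, H, W) integer label maps -> voxelwise majority vote."""
    stacked = np.stack(label_maps)                    # (models, D, H, W)
    n_classes = int(stacked.max()) + 1
    votes = np.stack([(stacked == c).sum(axis=0) for c in range(n_classes)])
    return votes.argmax(axis=0)                       # ties resolved by lowest label
```
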
### Task 3: Uncertainty

| MEASURE | METHOD | AUC DICE WT | AUC DICE TC | AUC DICE ET | FTP RATIO WT | FTP RATIO TC | FTP RATIO ET | FTN RATIO WT | FTN RATIO TC | FTN RATIO ET |
| ------- | ------ | ----------- | ----------- | ----------- | ------------ | ------------ | ------------ | ------------ | ------------ | ------------ |
| Variance | TTA Residual 3D-UNet-multiscale | 0.8316 | 0.7715 | 0.7088 | 0.0538 | 0.0449 | 0.0380 | 0.0009 | 0.0002 | 0.0001 |
| Variance | TTD Residual 3D-UNet-multiscale | 0.8300 | 0.7582 | 0.7318 | 0.1646 | 0.1558 | 0.0937 | 0.0024 | 0.0015 | 0.0004 |
| Variance | TTA + TTD Residual 3D-UNet-multiscale | 0.8325 | 0.7632 | 0.7276 | 0.1812 | 0.1588 | 0.0998 | 0.0036 | 0.0020 | 0.0005 |
| Entropy | TTA Residual 3D-UNet-multiscale | 0.8326 | 0.7816 | 0.7138 | 0.0635 | 0.0476 | 0.0362 | 0.0011 | 0.0047 | 0.0063 |
| Entropy | TTD Residual 3D-UNet-multiscale | 0.8233 | 0.7797 | 0.7423 | 0.1512 | 0.1285 | 0.0698 | 0.0021 | 0.0082 | 0.0122 |
| Entropy | TTA + TTD Residual 3D-UNet-multiscale | 0.8343 | 0.7909 | 0.7710 | 0.1525 | 0.1213 | 0.0664 | 0.0030 | 0.0101 | 0.0139 |