# Hierarchical MRI tumor segmentation with densely connected 3D CNN

By Lele Chen, Yue Wu, [Adora M. DSouza](https://www.rochester.edu/college/gradstudies/profiles/adora-dsouza.html), Anas Z. Abidin, [Axel W. E. Wismueller](https://www.urmc.rochester.edu/people/27063859-axel-w-e-wismueller), [Chenliang Xu](https://www.cs.rochester.edu/~cxu22/).

University of Rochester.

### Table of Contents
0. [Introduction](#introduction)
0. [Citation](#citation)
0. [Running](#running)
0. [Model](#model)
0. [Disclaimer and known issues](#disclaimer-and-known-issues)
0. [Results](#results)

### Introduction

This repository contains the original models (dense24, dense48, no-dense) described in the paper "Hierarchical MRI tumor segmentation with densely connected 3D CNN" (https://arxiv.org/abs/1802.02427). The code can be applied directly to the [BraTS 2017](http://braintumorsegmentation.org/) data.

![model](https://github.com/lelechen63/MRI-tumor-segmentation-Brats/blob/master/img/brain.png)

### Citation

If you use these models or the ideas in your research, please cite:

    @inproceedings{DBLP:conf/miip/ChenWDAWX18,
      author    = {Lele Chen and
                   Yue Wu and
                   Adora M. DSouza and
                   Anas Z. Abidin and
                   Axel Wism{\"{u}}ller and
                   Chenliang Xu},
      title     = {{MRI} tumor segmentation with densely connected 3D {CNN}},
      booktitle = {Medical Imaging 2018: Image Processing, Houston, Texas, United States,
                   10-15 February 2018},
      pages     = {105741F},
      year      = {2018},
      crossref  = {DBLP:conf/miip/2018},
      url       = {https://doi.org/10.1117/12.2293394},
      doi       = {10.1117/12.2293394},
      timestamp = {Tue, 06 Mar 2018 10:50:01 +0100},
      biburl    = {https://dblp.org/rec/bib/conf/miip/ChenWDAWX18},
      bibsource = {dblp computer science bibliography, https://dblp.org}
    }

### Running

0. Pre-installation: [Tensorflow](https://www.tensorflow.org/install/), [ANTs](https://github.com/ANTsX/ANTs), [nibabel](http://nipy.org/nibabel/), [sklearn](http://scikit-learn.org/stable/), [numpy](http://www.numpy.org/)

0. Download and unzip the training data from [BraTS 2017](http://braintumorsegmentation.org/) (see the nibabel loading sketch after this list).

0. Use N4ITK to correct the data: `python n4correction.py /mnt/disk1/dat/lchen63/spie/Brats17TrainingData/HGG` (see the bias-correction sketch after this list).

0. Train the model: `python train.py`
    - `-gpu`: gpu id
    - `-bs`: batch size
    - `-mn`: model name, 'dense24' or 'dense48' or 'no-dense' or 'dense24_nocorrection'
    - `-nc`: [N4ITK bias correction](https://www.ncbi.nlm.nih.gov/pubmed/20378467), True or False
    - `-e`: epoch number
    - `-r`: data path
    - `-sp`: save path/name
    - ...

    For example:
    `python train.py -bs 2 -gpu 0 -mn dense24 -nc True -sp dense48_correction -e 5 -r /mnt/disk1/dat/lchen63/spie/Brats17TrainingData/HGG`

0. Test the model: `python test.py`
    - `-gpu`: gpu id
    - `-m`: model path, the saved model name
    - `-mn`: model name, 'dense24' or 'dense48' or 'no-dense' or 'dense24_nocorrection'
    - `-nc`: [N4ITK bias correction](https://www.ncbi.nlm.nih.gov/pubmed/20378467), True or False
    - `-r`: data path
    - ...

    For example:
    `python test.py -m Dense24_correction-2 -mn dense24 -gpu 0 -nc True -r /mnt/disk1/dat/lchen63/spie/Brats17TrainingData/HGG`
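
The repository's own data pipeline lives in `train.py`/`test.py`; the snippet below is only a minimal, hedged sketch of loading one BraTS 2017 case with nibabel. It assumes the standard per-case file naming (`<case>_flair.nii.gz`, `<case>_t1.nii.gz`, `<case>_t1ce.nii.gz`, `<case>_t2.nii.gz`, `<case>_seg.nii.gz`); the `load_case` helper is hypothetical, not part of this repo.

```python
import os
import numpy as np
import nibabel as nib

def load_case(case_dir):
    """Load the four MRI modalities and the label map of one BraTS 2017 case.

    Returns x with shape (240, 240, 155, 4) and seg with integer labels {0, 1, 2, 4}.
    """
    case = os.path.basename(case_dir.rstrip('/'))
    volumes = []
    for modality in ['flair', 't1', 't1ce', 't2']:
        path = os.path.join(case_dir, '{}_{}.nii.gz'.format(case, modality))
        volumes.append(nib.load(path).get_fdata().astype(np.float32))
    x = np.stack(volumes, axis=-1)  # stack the modalities as channels
    seg = nib.load(os.path.join(case_dir, '{}_seg.nii.gz'.format(case)))
    return x, seg.get_fdata().astype(np.uint8)

# Hypothetical usage with the data path from the examples above:
# x, seg = load_case('/mnt/disk1/dat/lchen63/spie/Brats17TrainingData/HGG/<case_folder>')
```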
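
What the N4ITK step does can be illustrated with nipype's ANTs wrapper. This is a rough sketch only, not the repository's `n4correction.py`; it assumes nipype is installed and the ANTs executables (including `N4BiasFieldCorrection`) are on your `PATH`. The `n4_correct` helper is hypothetical.

```python
from nipype.interfaces.ants import N4BiasFieldCorrection

def n4_correct(in_file, out_file):
    """Run N4ITK bias-field correction on a single 3D volume (requires ANTs)."""
    n4 = N4BiasFieldCorrection()
    n4.inputs.dimension = 3           # 3D MRI volume
    n4.inputs.input_image = in_file
    n4.inputs.output_image = out_file
    n4.run()

# Hypothetical usage on one modality of one case:
# n4_correct('Brats17_xxx_flair.nii.gz', 'Brats17_xxx_flair_corrected.nii.gz')
```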
### Model

0. Hierarchical segmentation
![model](https://github.com/lelechen63/MRI-tumor-segmentation-Brats/blob/master/img/system.png)

0. 3D densely connected CNN (a schematic sketch of the dense-connectivity pattern follows below)
![model](https://github.com/lelechen63/MRI-tumor-segmentation-Brats/blob/master/img/model.png)
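
The actual dense24/dense48 networks are defined in this repository's TensorFlow code; the sketch below only illustrates the dense-connectivity pattern in 3D with `tf.keras` layers. The depth, growth rate, and patch size are illustrative assumptions, not the paper's exact hyper-parameters.

```python
import tensorflow as tf

def dense_block_3d(x, num_layers=6, growth_rate=12):
    """3D dense block: each layer sees the concatenation of all previous feature maps."""
    features = [x]
    for _ in range(num_layers):
        h = features[0] if len(features) == 1 else tf.keras.layers.Concatenate(axis=-1)(features)
        h = tf.keras.layers.BatchNormalization()(h)
        h = tf.keras.layers.Activation('relu')(h)
        h = tf.keras.layers.Conv3D(growth_rate, kernel_size=3, padding='same')(h)
        features.append(h)
    return tf.keras.layers.Concatenate(axis=-1)(features)

# Illustrative usage on a 4-channel MRI patch (the patch size is an assumption):
inputs = tf.keras.Input(shape=(38, 38, 38, 4))
outputs = dense_block_3d(inputs)
model = tf.keras.Model(inputs, outputs)
model.summary()
```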
### Disclaimer and known issues

0. The code is implemented in Tensorflow.
0. In this paper, we only use the glioblastoma (HGG) dataset.
0. I did not configure `nipype.interfaces.ants.segmentation`. If you need to use the `n4correction.py` code, copy it to the bin directory where antsRegistration etc. are located, then run `python n4correction.py`.
0. If you want to train these models with this version of Tensorflow without modifications, please note that:
    - You need at least 12 GB of GPU memory.
    - There might be some other untested issues.

### Results

0. Result visualization:
![visualization](https://github.com/lelechen63/MRI-tumor-segmentation-Brats/blob/master/img/result.png)
![visualization](https://github.com/lelechen63/MRI-tumor-segmentation-Brats/blob/master/img/result2.png)

0. Quantitative results (see the sketch after the table for how per-region scores can be computed):

model|whole tumor|peritumoral edema (ED)|GD-enhancing tumor (ET)
:---:|:---:|:---:|:---:
Dense24|0.74|0.81|0.80
Dense48|0.61|0.78|0.79
no-dense|0.61|0.77|0.78
dense24+n4correction|0.72|0.83|0.81
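
Assuming the numbers above are per-region Dice overlap scores (the standard BraTS measure), the following is a minimal sketch of how they can be computed from a predicted and a ground-truth label volume, using the BraTS 2017 label convention (1 = necrosis/non-enhancing core, 2 = peritumoral edema, 4 = enhancing tumor). The helpers are hypothetical, not the repository's evaluation code.

```python
import numpy as np

def dice(pred_mask, gt_mask):
    """Dice coefficient between two binary masks."""
    intersection = np.logical_and(pred_mask, gt_mask).sum()
    denom = pred_mask.sum() + gt_mask.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

def region_dice(pred, gt):
    """Per-region Dice for BraTS-style label volumes (labels 0, 1, 2, 4)."""
    return {
        'whole tumor': dice(pred > 0, gt > 0),
        'peritumoral edema (ED)': dice(pred == 2, gt == 2),
        'GD-enhancing tumor (ET)': dice(pred == 4, gt == 4),
    }

# Hypothetical usage: pred and gt are integer label volumes of the same shape.
# scores = region_dice(pred, gt)
```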