<h1 align="center">COPDgene CT Registration</h1>

<h3 align="center">Full lung segmentation, pre-processing, and image registration methodologies applied to the 4DCT DIR-Lab dataset challenge.</h3>

![](./Figures/Segmentation/seg-banner.png)

Table of Contents
=================
<!--ts-->
   * [Dataset Structure](#dataset-structure)
   * [Dataset Exploration](#dataset-exploration)
   * [Setup Guideline](#setup-guideline)
   * [Test Inference](#test-inference)
<!--te-->
Dataset Structure
============
```
.
└── dataset
    ├── description.json                  # A JSON file that holds the dataset description.
    ├── train                             # Training data directory. Contains all sorts of data.
    │   ├── copd1
    │   │   ├── copd1_300_eBH_xyz_r1.txt  # exhale keypoints
    │   │   ├── copd1_300_iBH_xyz_r1.txt  # inhale keypoints
    │   │   ├── copd1_eBHCT.img           # raw exhale intensity volume
    │   │   └── copd1_iBHCT.img           # raw inhale intensity volume
    │   ├── copd2
    │   ├── copd3
    │   └── copd4
    └── test                              # Test data directory. Contains only the intensity volumes and inhale keypoints.
        ├── copdXX
        ├── copdXX
        ├── copdXX
        └── copdXX
```
Dataset Exploration
============
To visually explore the raw data given in the challenge, please use the MatlabUtilityPack1_v1 MATLAB files to read and import both the raw and keypoint data into MATLAB. Instructions can be found in the MatlabUtilityPack1.pdf file inside the directory.

Setup Guideline
============
As the main objective is to register and transform the labels, it is crucial to follow the steps in order to replicate the best results obtained with our implementation.

First, once the dataset is in the correct folder structure, we prepare the inhale keypoint files to match the transformix requirements using the following command. Note that `<<DATASET_SPLIT_PATH>> = dataset/train` when you want to run on the training data and obtain the same evaluation results. You can also set it to the test split; however, you will not be able to evaluate later if the exhale (ground-truth) keypoints are not given.
```
python prepare_keypoints_transformix.py --dataset_path "<<DATASET_SPLIT_PATH>>" --keypoint_type "inhale"
```
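For reference, transformix expects an input points file whose first line is `index` (voxel indices) or `point` (physical coordinates), whose second line is the number of points, followed by one `x y z` triplet per line. The sketch below only illustrates that target format, assuming whitespace-separated keypoints; the hypothetical `to_transformix_points` helper is not part of the repository, and `prepare_keypoints_transformix.py` may differ in details.

```
# Hypothetical sketch: convert a plain "x y z per line" keypoint file into the
# header format transformix expects (type line, point count, then coordinates).
def to_transformix_points(src_path: str, dst_path: str) -> None:
    with open(src_path) as f:
        coords = [ln.strip() for ln in f if ln.strip()]
    with open(dst_path, "w") as f:
        f.write("index\n")            # "index" for voxel indices, "point" for physical coordinates
        f.write(f"{len(coords)}\n")   # number of keypoints
        f.write("\n".join(coords) + "\n")

# Example call (hypothetical output file name):
# to_transformix_points("dataset/train/copd1/copd1_300_iBH_xyz_r1.txt",
#                       "dataset/train/copd1/copd1_300_iBH_xyz_r1_transformix.txt")
```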
Then, to work with the data, we need to parse the raw files into NIfTI format using the following command. This will create the NIfTI volumes in the same data folder.
```
python parse_raw.py --dataset_path "<<DATASET_SPLIT_PATH>>"
```
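For intuition, this conversion essentially reads the raw int16 buffer and attaches the geometry described in `description.json`. A rough sketch with numpy and SimpleITK is shown below; the `raw_to_nifti` helper, the volume shape, and the spacing are illustrative assumptions, since the real values come from `description.json`.

```
# Illustrative raw-to-NIfTI sketch (not the actual parse_raw.py implementation).
import numpy as np
import SimpleITK as sitk

def raw_to_nifti(img_path, nii_path, shape=(121, 512, 512), spacing=(0.625, 0.625, 2.5)):
    # Raw DIR-Lab volumes are stored as 16-bit integers, read here as a (z, y, x) array.
    vol = np.fromfile(img_path, dtype=np.int16).reshape(shape)
    image = sitk.GetImageFromArray(vol)   # numpy (z, y, x) -> ITK image of size (x, y, z)
    image.SetSpacing(spacing)             # voxel spacing in mm (x, y, z)
    sitk.WriteImage(image, nii_path)

# raw_to_nifti("dataset/train/copd1/copd1_iBHCT.img", "dataset/train/copd1/copd1_iBHCT.nii.gz")
```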
Next, segment the lungs. This directly produces the best segmentation out of all the segmentation mask iterations that were experimented with.
```
python segment.py --dataset_path "<<DATASET_SPLIT_PATH>>"
```
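For intuition, a classical lung segmentation boils down to an air/tissue threshold, discarding the background air, keeping the largest connected components, and some morphological clean-up. The simplified sketch below is a hypothetical baseline; the threshold and morphology choices are assumptions and not necessarily what `segment.py` does.

```
# Simplified, hypothetical lung-mask sketch: threshold + connected components + clean-up.
import numpy as np
import SimpleITK as sitk
from scipy import ndimage

def rough_lung_mask(nii_path, mask_path, threshold_hu=-320):
    image = sitk.ReadImage(nii_path)
    vol = sitk.GetArrayFromImage(image)                 # (z, y, x) numpy array

    air = vol < threshold_hu                            # air-like voxels: lungs + surrounding background
    labels, _ = ndimage.label(air)
    air &= labels != labels[0, 0, 0]                    # drop the background component touching a corner

    labels, n = ndimage.label(air)
    sizes = ndimage.sum(air, labels, range(1, n + 1))
    keep = np.argsort(sizes)[-2:] + 1                   # two largest components ~ left and right lung
    lungs = np.isin(labels, keep)
    lungs = ndimage.binary_closing(lungs, structure=np.ones((5, 5, 5)))
    lungs = ndimage.binary_fill_holes(lungs)

    mask = sitk.GetImageFromArray(lungs.astype(np.uint8))
    mask.CopyInformation(image)                         # carry over spacing/origin/direction
    sitk.WriteImage(mask, mask_path)
```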
After that, we pre-process the data using the normalization implementation (as it gave the best results).
```
python preprocess.py --dataset_path "<<DATASET_SPLIT_PATH>>" --experiment_name "Normalization"
```
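As a rough idea of what the `Normalization` experiment refers to, intensity normalization can be as simple as clipping to a fixed window and min-max rescaling to [0, 1]. The window and output range below are assumptions, not necessarily what `preprocess.py` implements.

```
# Hypothetical normalization sketch: clip CT intensities to a window, rescale to [0, 1].
import numpy as np
import SimpleITK as sitk

def normalize(nii_path, out_path, window=(-1000.0, 500.0)):
    image = sitk.ReadImage(nii_path)
    vol = sitk.GetArrayFromImage(image).astype(np.float32)
    vol = np.clip(vol, *window)
    vol = (vol - window[0]) / (window[1] - window[0])   # min-max rescale to [0, 1]
    out = sitk.GetImageFromArray(vol)
    out.CopyInformation(image)                          # keep spacing/origin/direction
    sitk.WriteImage(out, out_path)
```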
This will create a new folder in your directory called `dataset_processed/Normalization/train/*`, with the same structure as before. It is important to manually move the segmentations and the keypoint txt files from the `<<DATASET_SPLIT_PATH>>` path into this directory, as well as the `description.json` file; otherwise, the next steps won't work on the processed dataset! One way to do this is sketched below. We will refer to `dataset_processed/Normalization/train` as `<<PROCESSED_DATASET_SPLIT_PATH>>`.
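
A minimal sketch of that manual copy step is shown below, assuming the folder layout described above; the glob patterns, the mask file naming, and the destination of `description.json` are assumptions that you may need to adjust to the actual file names.

```
# Hypothetical helper for the manual copy step; adjust patterns and paths to your layout.
import shutil
from pathlib import Path

src = Path("dataset/train")                           # <<DATASET_SPLIT_PATH>>
dst = Path("dataset_processed/Normalization/train")   # <<PROCESSED_DATASET_SPLIT_PATH>>

shutil.copy(src.parent / "description.json", dst)     # dataset description (location assumed)
for case in (d for d in src.iterdir() if d.is_dir()):
    (dst / case.name).mkdir(parents=True, exist_ok=True)
    for pattern in ("*.txt", "*mask*.nii*"):          # keypoint files and lung masks (assumed names)
        for f in case.glob(pattern):
            shutil.copy(f, dst / case.name / f.name)
```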
To create a single experiment, you can run `create_script.py` as below. It creates the output folder for your experiment in the project directory, as well as the Windows .bat file to be called.
```
python create_script.py --dataset_path "<<PROCESSED_DATASET_SPLIT_PATH>>" --experiment_name "Normalization+UseMasks3+SingleParamFile" --parameters_path "elastix-parameters/Par0003/Par0003.bs-R6-ug.txt" --use_masks
```
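For orientation, the generated .bat essentially chains an elastix registration (with the lung masks when `--use_masks` is set) and a transformix run on the prepared inhale keypoints. The sketch below is illustrative only: the file names and the fixed/moving choice are assumptions, and the real commands are written by `create_script.py`.

```
# Illustrative only: roughly the kind of elastix/transformix calls the generated .bat wraps
# for one case. All file names are assumptions; create_script.py writes the real ones.
from pathlib import Path

case = "copd1"
out_dir = Path("output/Normalization+UseMasks3+SingleParamFile/Par0003.bs-R6-ug") / case

elastix_cmd = (
    f"elastix -f {case}_iBHCT.nii.gz -m {case}_eBHCT.nii.gz "
    f"-fMask {case}_iBHCT_lungmask.nii.gz -mMask {case}_eBHCT_lungmask.nii.gz "  # only with --use_masks
    f"-p elastix-parameters/Par0003/Par0003.bs-R6-ug.txt "                       # one -p per parameter file
    f"-out {out_dir}"
)
transformix_cmd = (
    f"transformix -def {case}_300_iBH_xyz_r1_transformix.txt "                   # prepared inhale keypoints
    f"-tp {out_dir}/TransformParameters.0.txt -out {out_dir}"
)

out_dir.mkdir(parents=True, exist_ok=True)
(out_dir.parent / "elastix_transformix.bat").write_text(f"{elastix_cmd}\n{transformix_cmd}\n")
```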
Note that you can pass a directory such as `elastix-parameters/Par0003` instead of a single text file; elastix will then run the registration using all of the parameter files in order, passed with multiple `-p` flags.

Inside the output folder of the experiment, you will find the created .bat file, which you can call as below.
```
call output\Normalization+UseMasks3+SingleParamFile\Par0003.bs-R6-ug\elastix_transformix.bat 
```
To evaluate the registration and create the transformation points submission file, use `--generate_report` when the ground-truth (exhale) points exist. This will create the transformation points file and log the results.
```
python evaluate_transformation.py --experiment_name "Normalization+UseMasks3+SingleParamFile" --reg_params_key "Par0003.bs-R6-ug" --dataset_path "<<PROCESSED_DATASET_SPLIT_PATH>>"  --generate_report 
```
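For context, the metric typically reported for this challenge is the mean target registration error (TRE): the mean Euclidean distance, in millimetres, between the transformed inhale landmarks and the ground-truth exhale landmarks. A minimal sketch of that computation is given below, assuming both point sets are voxel indices; the example spacing is an assumption, and the real script also has to parse transformix's output points.

```
# Hypothetical TRE computation: mean Euclidean distance (mm) between transformed
# inhale landmarks and ground-truth exhale landmarks given as voxel indices.
import numpy as np

def mean_tre_mm(pred_points_vox, gt_points_vox, spacing_xyz):
    pred = np.asarray(pred_points_vox, dtype=float) * spacing_xyz   # voxel -> mm
    gt = np.asarray(gt_points_vox, dtype=float) * spacing_xyz
    return float(np.linalg.norm(pred - gt, axis=1).mean())

# Example with an assumed (x, y, z) spacing of (0.625, 0.625, 2.5) mm:
# tre = mean_tre_mm(transformed_inhale_vox, exhale_vox, np.array([0.625, 0.625, 2.5]))
```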
If the ground-truth (exhale) points are not given, use the same command without `--generate_report`:
```
python evaluate_transformation.py --experiment_name "Normalization+UseMasks3+SingleParamFile" --reg_params_key "Par0003.bs-R6-ug" --dataset_path "<<PROCESSED_DATASET_SPLIT_PATH>>"
```
Test Inference
============
To run inference on a dataset (here, assuming the test split), please make sure the subject folders follow the same directory structure, e.g. inside `dataset/test/*`. Then run the following commands in order.

Make sure to run `prepare_keypoints_transformix.py` only once.
```
python prepare_keypoints_transformix.py --dataset_path "dataset/test" --keypoint_type "inhale"
```
```
python parse_raw.py --dataset_path "dataset/test"
```
```
python segment.py --dataset_path "dataset/test"
```
Make sure to copy the landmarks, the lung masks (segmentations), and the dataset `description.json` file to the output folder after pre-processing.
```
python preprocess.py --dataset_path "dataset/test" --experiment_name "Normalization"
```
```
python create_script.py --dataset_path "dataset_processed/Normalization/test" --experiment_name "TEST-ALL" --parameters_path "elastix-parameters/ParCOPDBest/Par0003.bs-R6-ug-5000SpatialSamples-3000itr.txt" --use_masks
```
Then call the batch file created by the above script. Finally, run the evaluation below to create the submission files (without `--generate_report`, as we don't have ground truth).
```
python evaluate_transformation.py --experiment_name "TEST-ALL" --reg_params_key "Par0003.bs-R6-ug-5000SpatialSamples-3000itr" --dataset_path "dataset_processed/Normalization/test"
```