# DeepNeuronSeg

DeepNeuronSeg is a full-stack, end-to-end machine learning pipeline for neuroimaging data analysis. The framework streamlines the entire workflow, from data preprocessing and augmentation to neural-network-based denoising and segmentation. With a focus on performance and ease of use, it lets researchers efficiently analyze complex neuroimaging datasets and derive meaningful insights with minimal overhead.

# Installation Guide

## Build from conda env (Recommended)

- Installation requirements
  - Python
  - Conda
  - Git
- Download the [DeepNeuronSeg.yaml](https://github.com/josh-segal/DeepNeuronSeg/blob/main/DEEPNEURONSEG.yaml) file anywhere on your computer
- In the terminal, navigate to the folder where the DeepNeuronSeg.yaml file is located
- Run `conda env create -f DeepNeuronSeg.yaml` to create the DeepNeuronSeg environment
- Activate the environment with `conda activate DEEPNEURONSEG`
- Launch the program with `python -m DeepNeuronSeg`
- To relaunch later, run `python -m DeepNeuronSeg` again
  - If the environment has been deactivated, reactivate it with `conda activate DEEPNEURONSEG`, then launch with `python -m DeepNeuronSeg`
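
Put together, the install-and-launch sequence looks like the sketch below, assuming the terminal is already in the folder containing `DeepNeuronSeg.yaml`:

```bash
# Create the conda environment defined by the downloaded YAML file
conda env create -f DeepNeuronSeg.yaml

# Activate the environment and launch DeepNeuronSeg
conda activate DEEPNEURONSEG
python -m DeepNeuronSeg
```

On later sessions, only the last two commands are needed.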

## Build From Source

- Installation requirements
  - Python
  - Git
- In a terminal at the desired location, run the following commands:
  - `mkdir test_folder`
    - Makes the directory for downloading the project
  - `cd test_folder`
    - Navigates into that directory
  - `git clone https://github.com/josh-segal/DeepNeuronSeg.git`
    - Downloads a copy of the project to your local computer
  - `cd DeepNeuronSeg`
    - Navigates into the DeepNeuronSeg project directory
  - `python -m venv venv`
    - Creates a Python virtual environment so DeepNeuronSeg's dependencies are installed without conflicting with your system packages
  - `venv\Scripts\activate` (Windows) or `source venv/bin/activate` (macOS/Linux)
    - Activates the virtual environment
  - `pip install -r requirements.txt`
    - Installs the dependencies required by DeepNeuronSeg
  - `python -m DeepNeuronSeg`
    - Launches the DeepNeuronSeg program; start exploring!
- To launch again, navigate to the DeepNeuronSeg directory, re-activate the virtual environment, and run `python -m DeepNeuronSeg`
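
As a quick reference, the build-from-source steps above can be run as the following sketch (macOS/Linux activation shown; on Windows use `venv\Scripts\activate` instead of the `source` line):

```bash
# Create a working directory and download the project
mkdir test_folder
cd test_folder
git clone https://github.com/josh-segal/DeepNeuronSeg.git
cd DeepNeuronSeg

# Create and activate an isolated virtual environment
python -m venv venv
source venv/bin/activate

# Install dependencies and launch the program
pip install -r requirements.txt
python -m DeepNeuronSeg
```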

# Usage

## Upload Data

Upload images by selecting PNG files from the file explorer.

Upload labels as PNG (binary mask), CSV, TXT, or XML files (the last three as coordinates downloaded from the ImageJ Cell Counter).
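
For coordinate-based labels, each row marks one cell. The snippet below is purely illustrative, with hypothetical column names and values; the actual layout should match whatever the ImageJ Cell Counter exports for your images.

```text
x,y
132,245
310,98
407,512
```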

Optionally enter a project ID, cohort, brain region, and image ID.

Scroll through the images to confirm them, or select specific images through the file selector.

## Label Data

Display Data to load the uploaded data.

Click on cells in the image to set a label.

Right-click to remove cells.

Next Image to navigate through the data.

## Generate Labels

Generate Labels to pass the images and labels to the label generator.

Next Image to scroll through the data.

Display Labels to display the generated labels on startup.

## Create Dataset

Train Split to set the portion of the data to train on; the remainder is used for validation (for example, an 80% split trains on 80% of the images and validates on the remaining 20%).

Dataset Name to set the name of the dataset.

File selector to choose which files to include in your dataset.

## Train Network

Choose a base model to train on.

Choose a dataset to train with.

Set the number of epochs and the batch size for training.

Choose a name for the trained model.

Choose whether to train a custom denoise model, use the default denoise model, or use no denoise model.

Use the default dataset augmentation, no dataset augmentation, or custom dataset augmentation.

## Evaluate Network

Choose a trained model to evaluate.

Choose a dataset to evaluate on.

Calculates average and variability metrics for the chosen dataset with the chosen model.

Optionally display a graph of the number of detections and the confidence for the images in the dataset.

Download Data to download a CSV mapping images to their raw metrics.

## Analyze Data

Pass new data through the model and retrieve the resulting average and variability metrics.

Compares the results to the base dataset and computes an overall variance score to determine whether the new data is an outlier.

Option to display the graph with the new data inserted.

Option to save inferences as images with the predictions marked.

## Extract Outliers

Displays data with an outlier score above the set threshold; the user can change the threshold manually.

The user can validate the data or relabel it.

Relabeling inserts the image and its labels back into the data, so the user can add them to a dataset and retrain.

## Model Zoo

The user can choose any of the trained models and run inference on images.

Displays the inferences for the user to inspect.

The user can save the inferences to their computer.