<div align="center">    

![Logo](./src/neuralcvd_logo.png?raw=true "Logo")

**Neural network-based integration of polygenic and clinical information: Development and validation of a prediction model for 10 year risk of major adverse cardiac events in the UK Biobank cohort**

[![Paper](https://img.shields.io/badge/TheLancet-DigitalHealth-informational)](https://www.thelancet.com/journals/landig/article/PIIS2589-7500(21)00249-1/fulltext)

</div>

## Description   
Code related to the paper "Neural network-based integration of polygenic and clinical information: Development and validation of a prediction model for 10 year risk of major adverse cardiac events in the UK Biobank cohort". 
This repo is a Python package for preprocessing UK Biobank data and for training and evaluating the NeuralCVD score.

![NeuralCVD](./src/neuralcvd_fig1.png?raw=true "NeuralCVD")

## Methods
**NeuralCVD** is based on the fantastic [Deep Survival Machines](https://arxiv.org/abs/2003.01176) paper; the original implementation can be found [here](https://github.com/autonlab/DeepSurvivalMachines).
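
For intuition: Deep Survival Machines models each individual's survival function as a mixture of parametric (e.g. Weibull) distributions whose parameters and mixture weights are predicted from the input features by a neural network. The snippet below is a minimal, illustrative PyTorch sketch of that idea; the class, layer sizes and variable names are invented for this README and do not mirror the actual NeuralCVD/DSM code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureWeibullSurvival(nn.Module):
    """Toy DSM-style head: a shared encoder predicts the parameters of
    K Weibull distributions plus mixture weights for every subject."""

    def __init__(self, n_features: int, k: int = 4, hidden: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.shape = nn.Linear(hidden, k)   # Weibull shape per component
        self.scale = nn.Linear(hidden, k)   # Weibull scale per component
        self.logits = nn.Linear(hidden, k)  # mixture weights per component

    def survival(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        """S(t | x) = sum_k w_k(x) * exp(-(t / scale_k(x)) ** shape_k(x))."""
        h = self.encoder(x)
        shape = F.softplus(self.shape(h)) + 1e-3   # keep parameters positive
        scale = F.softplus(self.scale(h)) + 1e-3
        weights = F.softmax(self.logits(h), dim=-1)
        s_k = torch.exp(-((t.unsqueeze(-1) / scale) ** shape))  # per-component survival
        return (weights * s_k).sum(dim=-1)

model = MixtureWeibullSurvival(n_features=32)
x = torch.randn(8, 32)             # 8 subjects, 32 predictors
t = torch.full((8,), 10.0)         # e.g. a 10-year horizon
risk = 1.0 - model.survival(x, t)  # predicted event probability by time t
```

The real model is trained with a censoring-aware likelihood (an ELBO over observed events and censored follow-up); see the paper and reference implementation linked above for the details.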

## Assets
This repo contains code to preprocess [UK Biobank](https://www.ukbiobank.ac.uk/) data, train the NeuralCVD score, and analyze/evaluate its performance.

- Preprocessing involves parsing primary care records for the desired diagnoses, aggregating the cardiovascular risk factors analyzed in the study, and calculating predefined polygenic risk scores.
- Training involves model specification via PyTorch Lightning and Hydra.
- Postprocessing involves extensive benchmarks with linear models and calculation of bootstrapped metrics (a sketch of such an evaluation follows below).
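
To illustrate the bootstrapped evaluation, the snippet below computes a percentile-bootstrap confidence interval for Harrell's concordance index using `lifelines` and NumPy; the function and variable names are placeholders, not the repository's actual evaluation code.

```python
import numpy as np
from lifelines.utils import concordance_index

def bootstrap_cindex(durations, events, risk_scores, n_boot=1000, seed=0):
    """Percentile bootstrap of Harrell's C-index (higher risk score = earlier event)."""
    rng = np.random.default_rng(seed)
    n, stats = len(durations), []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample subjects with replacement
        # lifelines expects predictions concordant with longer survival,
        # so the risk scores are negated here.
        stats.append(concordance_index(durations[idx], -risk_scores[idx], events[idx]))
    lo, hi = np.percentile(stats, [2.5, 97.5])
    return float(np.mean(stats)), (float(lo), float(hi))

# durations: follow-up time in years, events: 1 = MACE observed, risk_scores: predicted 10-year risk
# mean_c, ci95 = bootstrap_cindex(durations, events, risk_scores)
```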

## How to train the NeuralCVD Model  
1. First, install dependencies:
```bash
# clone project
git clone https://github.com/thbuerg/NeuralCVD

# install project
cd NeuralCVD
pip install -e .
pip install -r requirements.txt
```

2. Download UK Biobank data, then run the preprocessing notebooks on the downloaded data.

3. Edit the `.yaml` config files in `neuralcvd/experiments/config/`:
```yaml
setup:
  project_name: <YourNeptuneSpace>/<YourProject>
  root_dir: absolute/path/to/this/repo/
experiment:
  tabular_filepath: path/to/processed/data
```
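
For reference, a Hydra-based entry point consumes such a file as a nested config object, so the keys above map directly to attributes in the training code. The sketch below only illustrates that mapping; the script name, `config_name` and the training logic are hypothetical and not taken from this repository.

```python
# hypothetical_train.py -- illustrative only, not the repository's actual entry point
import hydra
from omegaconf import DictConfig

@hydra.main(config_path="experiments/config", config_name="example")  # config_name is a placeholder
def main(cfg: DictConfig) -> None:
    print("Neptune project:", cfg.setup.project_name)        # setup.project_name from the yaml
    print("repository root:", cfg.setup.root_dir)            # setup.root_dir
    print("tabular data:", cfg.experiment.tabular_filepath)  # experiment.tabular_filepath
    # ... build the data module / model here and call trainer.fit(...)

if __name__ == "__main__":
    main()
```

Individual values can also be overridden at launch time in the usual Hydra way, e.g. by appending `experiment.tabular_filepath=/new/path` to the training command.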

4. Set up [Neptune.ai](https://www.neptune.ai) for experiment tracking.
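
Neptune typically reads its API token from the `NEPTUNE_API_TOKEN` environment variable; a minimal sanity check you might run before launching training (the token itself comes from your Neptune account settings):

```python
import os

# Assumption: the experiment logger reads the token from NEPTUNE_API_TOKEN,
# Neptune's standard environment variable.
if "NEPTUNE_API_TOKEN" not in os.environ:
    raise RuntimeError("Export NEPTUNE_API_TOKEN before launching training.")
```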

5. Train the NeuralCVD Model (make sure you are on a machine with a GPU):
```bash
# module folder
cd neuralcvd

# run training
bash experiments/run_NeuralCVD_S.sh
```

## Citation   
```bibtex
@article{steinfeldt2022neural,
  title={Neural network-based integration of polygenic and clinical information: development and validation of a prediction model for 10-year risk of major adverse cardiac events in the UK Biobank cohort},
  author={Steinfeldt, Jakob and Buergel, Thore and Loock, Lukas and Kittner, Paul and Ruyoga, Greg and zu Belzen, Julius Upmeier and Sasse, Simon and Strangalies, Henrik and Christmann, Lara and Hollmann, Noah and others},
  journal={The Lancet Digital Health},
  volume={4},
  number={2},
  pages={e84--e94},
  year={2022},
  publisher={Elsevier}
}
```