# Survival Integration of Multi-omics using Deep-Learning (DeepProg)

This package combines multi-omics data with survival data. Using autoencoders, the pipeline creates new features and identifies those linked with survival, using CoxPH regression. The omic data used in the original study are RNA-Seq, MiR and Methylation. However, this approach can be extended to any combination of omic data.

The current package contains the omic data used in the study and a copy of the computed model. However, it is very easy to recreate a new model from scratch using any combination of omic data. The omic data and the survival files should be in tsv (Tab Separated Values) format, and examples are provided. The deep-learning framework uses Keras, which is a wrapper around Theano / tensorflow / CNTK.
h>hh)]qn(h@)qo}qp(h.XRequirementsh/hih0h1h2hCh4}qq(h6]h7]h8]h9]h;]uh=K
h)]qrhFXRequirementsqs…qt}qu(h.XRequirementsqvh/houbaubcdocutils.nodes
bullet_list
qw)qx}qy(h.Uh/hih0h1h2Ubullet_listqzh4}q{(h6]h7]h8]h9]h;]uh=Kh>hh)]q|(cdocutils.nodes
list_item
q})q~}q(h.Uh/hxh0h1h2U	list_itemq€h4}q(h6]h7]h8]h9]h;]uh=Kh>hh)]q‚hK)qƒ}q„(h.X
Python 2 or 3h/h~h0h1h2hNh4}q…(h6]h7]h8]h9]h;]uh=Kh>hh)]q†hFX
Python 2 or 3q‡…qˆ}q‰(h.X
Python 2 or 3qŠh0Nh=Nh>hh/hƒubaubaubh})q‹}qŒ(h.Uh/hxh0h1h2h€h4}q(h6]h7]h8]h9]h;]uh=Kh>hh)]qŽhK)q}q(h.X6theano (the used version for the manuscript was 0.8.2)h/h‹h0h1h2hNh4}q‘(h6]h7]h8]h9]h;]uh=Kh>hh)]q’(cdocutils.nodes
reference
q“)q”}q•(h.Xtheanoh/hh0h1h2U	referenceq–h4}q—(Urefuriq˜U4http://deeplearning.net/software/theano/install.htmlq™h9]h8]h6]h7]h;]uh=Kh>hh)]qšhFXtheanoq›…qœ}q(h.Xtheanoqžh0Nh=Nh>hh/h”ubaubhFX0 (the used version for the manuscript was 0.8.2)qŸ…q }q¡(h.X0 (the used version for the manuscript was 0.8.2)q¢h0Nh=Nh>hh/hubeubaubh})q£}q¤(h.Uh/hxh0h1h2h€h4}q¥(h6]h7]h8]h9]h;]uh=K
h>hh)]q¦hK)q§}q¨(h.X1tensorflow as a more robust alternative to theanoh/h£h0h1h2hNh4}q©(h6]h7]h8]h9]h;]uh=K
h>hh)]qª(h“)q«}q¬(h.X
tensorflowh/h§h0h1h2h–h4}q­(h˜Uhttps://www.tensorflow.org/q®h9]h8]h6]h7]h;]uh=K
h>hh)]q¯hFX
tensorflowq°…q±}q²(h.X
tensorflowq³h0Nh=Nh>hh/h«ubaubhFX' as a more robust alternative to theanoq´…qµ}q¶(h.X' as a more robust alternative to theanoq·h0Nh=Nh>hh/h§ubeubaubh})q¸}q¹(h.Uh/hxh0h1h2h€h4}qº(h6]h7]h8]h9]h;]uh=Kh>hh)]q»hK)q¼}q½(h.X™cntk CNTK is anoter DL library that can present some advantages compared to tensorflow or theano. See https://docs.microsoft.com/en-us/cognitive-toolkit/h/h¸h0h1h2hNh4}q¾(h6]h7]h8]h9]h;]uh=Kh>hh)]q¿(h“)qÀ}qÁ(h.Xcntkh/h¼h0h1h2h–h4}qÂ(h˜U!https://github.com/microsoft/CNTKqÃh9]h8]h6]h7]h;]uh=Kh>hh)]qÄhFXcntkqҁqÆ}qÇ(h.XcntkqÈh0Nh=Nh>hh/hÀubaubhFXb CNTK is anoter DL library that can present some advantages compared to tensorflow or theano. See qɅqÊ}qË(h.Xb CNTK is anoter DL library that can present some advantages compared to tensorflow or theano. See qÌh0Nh=Nh>hh/h¼ubh“)qÍ}qÎ(h.X3https://docs.microsoft.com/en-us/cognitive-toolkit/h/h¼h0h1h2h–h4}qÏ(h˜U3https://docs.microsoft.com/en-us/cognitive-toolkit/qÐh9]h8]h6]h7]h;]uh=Kh>hh)]qÑhFX3https://docs.microsoft.com/en-us/cognitive-toolkit/q҅qÓ}qÔ(h.X3https://docs.microsoft.com/en-us/cognitive-toolkit/qÕh0Nh=Nh>hh/hÍubaubeubaubh})qÖ}q×(h.Uh/hxh0h1h2h€h4}qØ(h6]h7]h8]h9]h;]uh=Kh>hh)]qÙhK)qÚ}qÛ(h.XRh/hÖh0h1h2hNh4}qÜ(h6]h7]h8]h9]h;]uh=Kh>hh)]qÝhFXR…qÞ}qß(h.XRh0Nh=Nh>hh/hÚubaubaubh})qà}qá(h.Uh/hxh0h1h2h€h4}qâ(h6]h7]h8]h9]h;]uh=Kh>hh)]qãhK)qä}qå(h.X#the R "survival" package installed.h/hàh0h1h2hNh4}qæ(h6]h7]h8]h9]h;]uh=Kh>hh)]qç(hFXthe R q腁qé}qê(h.Xthe R qëh0Nh=Nh>hh/häubhFX“…qì}qí(h.X"h0Nh=Nh>hh/häubhFXsurvivalqqï}qð(h.Xsurvivalqñh0Nh=Nh>hh/häubhFX”…qò}qó(h.X"h0Nh=Nh>hh/häubhFX package installed.qô…qõ}qö(h.X package installed.q÷h0Nh=Nh>hh/häubeubaubh})qø}qù(h.Uh/hxh0h1h2h€h4}qú(h6]h7]h8]h9]h;]uh=Kh>hh)]qûhK)qü}qý(h.Xnumpy, scipyh/høh0h1h2hNh4}qþ(h6]h7]h8]h9]h;]uh=Kh>hh)]qÿhFXnumpy, scipyr…r}r(h.Xnumpy, scipyrh0Nh=Nh>hh/hüubaubaubh})r}r(h.Uh/hxh0h1h2h€h4}r(h6]h7]h8]h9]h;]uh=Kh>hh)]rhK)r}r	(h.Xscikit-learn (>=0.18)h/jh0h1h2hNh4}r
(h6]h7]h8]h9]h;]uh=Kh>hh)]rhFXscikit-learn (>=0.18)rr
}r(h.Xscikit-learn (>=0.18)rh0Nh=Nh>hh/jubaubaubh})r}r(h.Uh/hxh0h1h2h€h4}r(h6]h7]h8]h9]h;]uh=Kh>hh)]rhK)r}r(h.X¾rpy2 2.8.6 (for python2 rpy2 can be install with: pip install rpy2==2.8.6, for python3 pip3 install rpy2==2.8.6). It seems that newer version of rpy2 might not work due to a bug (not tested)h/jh0h1h2hNh4}r(h6]h7]h8]h9]h;]uh=Kh>hh)]rhFX¾rpy2 2.8.6 (for python2 rpy2 can be install with: pip install rpy2==2.8.6, for python3 pip3 install rpy2==2.8.6). It seems that newer version of rpy2 might not work due to a bug (not tested)r…r}r(h.X¾rpy2 2.8.6 (for python2 rpy2 can be install with: pip install rpy2==2.8.6, for python3 pip3 install rpy2==2.8.6). It seems that newer version of rpy2 might not work due to a bug (not tested)rh0Nh=Nh>hh/jubaubaubeubcdocutils.nodes
literal_block
r)r}r(h.Xÿpip install theano --user # Original backend used OR
pip install tensorflow --user # Alternative backend for keras supposely for efficient
pip install keras --user
pip install rpy2==2.8.6 --user

#If you want to use theano or CNTK
nano ~/.keras/keras.jsonh/hih0h1h2U
literal_blockrh4}r (UlanguageXbashr!U	xml:spacer"Upreserver#h9]h8]h6]h7]h;]uh=Kh>hh)]r$hFXÿpip install theano --user # Original backend used OR
pip install tensorflow --user # Alternative backend for keras supposely for efficient
pip install keras --user
pip install rpy2==2.8.6 --user

#If you want to use theano or CNTK
nano ~/.keras/keras.jsonr%…r&}r'(h.Uh/jubaubhw)r(}r)(h.Uh/hih0h1h2hzh4}r*(h6]h7]h8]h9]h;]uh=K h>hh)]r+h})r,}r-(h.Uh/j(h0h1h2h€h4}r.(h6]h7]h8]h9]h;]uh=K h>hh)]r/hK)r0}r1(h.XR installationh/j,h0h1h2hNh4}r2(h6]h7]h8]h9]h;]uh=K h>hh)]r3hFXR installationr4…r5}r6(h.XR installationr7h0Nh=Nh>hh/j0ubaubaubaubj)r8}r9(h.Xxinstall.package("survival")
install.package("glmnet")
source("https://bioconductor.org/biocLite.R")
biocLite("survcomp")h/hih0h1h2jh4}r:(UlanguageXRj"j#h9]h8]h6]h7]h;]uh=Kh>hh)]r;hFXxinstall.package("survival")
install.package("glmnet")
source("https://bioconductor.org/biocLite.R")
biocLite("survcomp")r<…r=}r>(h.Uh/j8ubaubh+)r?}r@(h.Uh/hih0h1h2h3h4}rA(h6]h7]h8]h9]rBh&ah;]rChauh=K*h>hh)]rD(h@)rE}rF(h.XSupport for CNTK / tensorflowh/j?h0h1h2hCh4}rG(h6]h7]h8]h9]h;]uh=K*h)]rHhFXSupport for CNTK / tensorflowrI…rJ}rK(h.XSupport for CNTK / tensorflowrLh/jEubaubhw)rM}rN(h.Uh/j?h0h1h2hzh4}rO(h6]h7]h8]h9]h;]uh=K+h>hh)]rPh})rQ}rR(h.Uh/jMh0h1h2h€h4}rS(h6]h7]h8]h9]h;]uh=K+h>hh)]rThK)rU}rV(h.XgWe originally used Keras with theano as backend plateform. However, Tensorflow or CNTK are more recent DL framework that can be faster or more stable than theano. Because keras supports these 3 backends, it is possible to use them as alternative to theano. To change backend, please configure the $HOME/.keras/keras.json file. (See official instruction here).h/jQh0h1h2hNh4}rW(h6]h7]h8]h9]h;]uh=K+h>hh)]rX(hFXDWe originally used Keras with theano as backend plateform. However, rY…rZ}r[(h.XDWe originally used Keras with theano as backend plateform. However, r\h0Nh=Nh>hh/jUubh“)r]}r^(h.X
- We originally used Keras with theano as the backend platform. However, [Tensorflow](https://www.tensorflow.org/) and [CNTK](https://docs.microsoft.com/en-us/cognitive-toolkit/) are more recent DL frameworks that can be faster or more stable than theano. Because keras supports these 3 backends, it is possible to use them as alternatives to theano. To change the backend, please configure the `$HOME/.keras/keras.json` file. (See the official instructions [here](https://keras.io/backend/).)

The default configuration file looks like this:

```json
{
    "image_data_format": "channels_last",
    "epsilon": 1e-07,
    "floatx": "float32",
    "backend": "tensorflow"
}
```
}rŸ…r }r¡(h.Uh/jšubaubeubeubh+)r¢}r£(h.Uh/h,h0h1h2h3h4}r¤(h6]h7]h8]h9]r¥hah;]r¦hauh=K8h>hh)]r§(h@)r¨}r©(h.XDistributed computationh/j¢h0h1h2hCh4}rª(h6]h7]h8]h9]h;]uh=K8h)]r«hFXDistributed computationr¬…r­}r®(h.XDistributed computationr¯h/j¨ubaubhw)r°}r±(h.Uh/j¢h0h1h2hzh4}r²(h6]h7]h8]h9]h;]uh=K9h>hh)]r³(h})r´}rµ(h.Uh/j°h0h1h2h€h4}r¶(h6]h7]h8]h9]h;]uh=K9h>hh)]r·hK)r¸}r¹(h.XÙIt is possible to use the python ray framework https://github.com/ray-project/ray to control the parallel computation of the multiple models. To use this framework, it is required to install it: pip install ray --userh/j´h0h1h2hNh4}rº(h6]h7]h8]h9]h;]uh=K9h>hh)]r»(hFX/It is possible to use the python ray framework r¼…r½}r¾(h.X/It is possible to use the python ray framework r¿h0Nh=Nh>hh/j¸ubh“)rÀ}rÁ(h.X"https://github.com/ray-project/rayh/j¸h0h1h2h–h4}rÂ(h˜U"https://github.com/ray-project/rayrÃh9]h8]h6]h7]h;]uh=K9h>hh)]rÄhFX"https://github.com/ray-project/rayrÅ…rÆ}rÇ(h.X"https://github.com/ray-project/rayrÈh0Nh=Nh>hh/jÀubaubhFXr to control the parallel computation of the multiple models. To use this framework, it is required to install it: rÉ…rÊ}rË(h.Xr to control the parallel computation of the multiple models. To use this framework, it is required to install it: rÌh0Nh=Nh>hh/j¸ubjw)rÍ}rÎ(h.Xpip install ray --userrÏh/j¸h0h1h2j{h4}rÐ(h6]h7]h8]h9]h;]uh=Kh>hh)]rÑhFXpip install ray --userrÒ…rÓ}rÔ(h.Uh0Nh=Nh>hh/jÍubaubeubaubh})rÕ}rÖ(h.Uh/j°h0h1h2h€h4}r×(h6]h7]h8]h9]h;]uh=K:h>hh)]rØhK)rÙ}rÚ(h.XgAlternatively, it is also possible to create the model one by one without the need of the ray frameworkh/jÕh0h1h2hNh4}rÛ(h6]h7]h8]h9]h;]uh=K:h>hh)]rÜhFXgAlternatively, it is also possible to create the model one by one without the need of the ray frameworkrÝ…rÞ}rß(h.XgAlternatively, it is also possible to create the model one by one without the need of the ray frameworkràh0Nh=Nh>hh/jÙubaubaubeubeubh+)rá}râ(h.Uh/h,h0h1h2h3h4}rã(h6]h7]h8]h9]räh!ah;]råhauh=K<h>hh)]ræ(h@)rç}rè(h.X#Visualisation module (Experimental)h/jáh0h1h2hCh4}ré(h6]h7]h8]h9]h;]uh=K<h)]rêhFX#Visualisation module (Experimental)r녁rì}rí(h.X#Visualisation module (Experimental)rîh/jçubaubhw)rï}rð(h.Uh/jáh0h1h2hzh4}rñ(h6]h7]h8]h9]h;]uh=K=h>hh)]rò(h})ró}rô(h.Uh/jïh0h1h2h€h4}rõ(h6]h7]h8]h9]h;]uh=K=h>hh)]röhK)r÷}rø(h.X…To visualise test sets projected into the multi-omic survival space, it is required to install mpld3 module: pip install mpld3 --userh/jóh0h1h2hNh4}rù(h6]h7]h8]h9]h;]uh=K=h>hh)]rú(hFX_To visualise test sets projected into the multi-omic survival space, it is required to install rû…rü}rý(h.X_To visualise test sets projected into the multi-omic survival space, it is required to install rþh0Nh=Nh>hh/j÷ubjw)rÿ}r(h.Xmpld3rh/j÷h0h1h2j{h4}r(h6]h7]h8]h9]h;]uh=Kh>hh)]rhFXmpld3r…r}r(h.Uh0Nh=Nh>hh/jÿubaubhFX	 module: r…r}r	(h.X	 module: r
h0Nh=Nh>hh/j÷ubjw)r}r(h.Xpip install mpld3 --userr
- Note that the pip version of mpld3 installed on my computer presented a [bug](https://github.com/mpld3/mpld3/issues/434): `TypeError: array([1.]) is not JSON serializable`. However, the [newest](https://github.com/mpld3/mpld3) version of mpld3, available from github, solved this issue. It is therefore recommended to install the newest version to avoid it.

## installation (local)

```bash
git clone https://github.com/lanagarmire/SimDeep.git
cd SimDeep
pip install -r requirements.txt --user
```

## Usage

- test if simdeep is functional (i.e. that all the software dependencies are correctly installed):

```bash
python test/test_dummy_boosting_stacking.py -v # OR
nosetests test -v # Improved version of python unit testing
```

- All the default parameters are defined in the config file `./simdeep/config.py` but can be passed dynamically. Three types of parameters must be defined:
  - The training dataset (omics + survival input files)
    - In addition, the parameters of the test set, i.e. the omic dataset and the survival file
  - The parameters of the autoencoder (the default parameters work but might be fine-tuned)
  - The parameters of the classification procedures (the defaults are still good)
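For instance, the training files can be passed dynamically instead of relying on `config.py`. A minimal sketch using the dummy example files (the `training_tsv` and `survival_tsv` keywords follow the `LoadData` call shown in the sections below; the `path_data` keyword is an assumption):

```python
from collections import OrderedDict

from simdeep.extract_data import LoadData

# Map each omic type to its tsv file (dummy example dataset)
training_tsv = OrderedDict([('RNA', 'rna_dummy.tsv'), ('MIR', 'mir_dummy.tsv')])

dataset = LoadData(
    path_data='./examples/data/',  # assumption: folder containing the tsv files
    training_tsv=training_tsv,
    survival_tsv='survival_dummy.tsv',
)
```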
## Example datasets and scripts

An omic .tsv file must have this format:

```bash
head mir_dummy.tsv

Samples        dummy_mir_0     dummy_mir_1     dummy_mir_2     dummy_mir_3 ...
sample_test_0  0.469656032287  0.347987447237  0.706633335508  0.440068758445 ...
sample_test_1  0.0453108219657 0.0234642968791 0.593393816691  0.981872970341 ...
sample_test_2  0.908784043793  0.854397550009  0.575879144667  0.553333958713 ...
...
```

a survival file must have this format:

```bash
head survival_dummy.tsv

Samples        days event
sample_test_0  134  1
sample_test_1  291  0
sample_test_2  125  1
sample_test_3  43   0
...
```
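Both file types can be checked quickly before launching a model. A small sanity-check sketch (assuming pandas is installed; the file names are the dummy examples):

```python
import pandas as pd

# Load an omic matrix and a survival table from the dummy example dataset
omic = pd.read_csv('examples/data/mir_dummy.tsv', sep='\t', index_col=0)
survival = pd.read_csv('examples/data/survival_dummy.tsv', sep='\t', index_col=0)

# Every omic sample should have a survival record
assert omic.index.isin(survival.index).all()
print(omic.shape, survival.shape)
```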
r…r}r(h.Uh/jûubaubhK)r}r(h.X&As examples, we included two datasets:h/jÕh0h1h2hNh4}r(h6]h7]h8]h9]h;]uh=Ksh>hh)]rhFX&As examples, we included two datasets:r…r}r	(h.X&As examples, we included two datasets:r
h0Nh=Nh>hh/jubaubhw)r}r(h.Uh/jÕh0h1h2hzh4}r
(h6]h7]h8]h9]h;]uh=Kth>hh)]rh})r}r(h.Uh/jh0h1h2h€h4}r(h6]h7]h8]h9]h;]uh=Kth>hh)]rhK)r}r(h.X4A dummy example dataset in the example/data/ folder:h/jh0h1h2hNh4}r(h6]h7]h8]h9]h;]uh=Kth>hh)]r(hFXA dummy example dataset in the r…r}r(h.XA dummy example dataset in the rh0Nh=Nh>hh/jubjw)r}r(h.X
example/data/rh/jh0h1h2j{h4}r(h6]h7]h8]h9]h;]uh=Kh>hh)]rhFX
example/data/r …r!}r"(h.Uh0Nh=Nh>hh/jubaubhFX folder:r#…r$}r%(h.X folder:r&h0Nh=Nh>hh/jubeubaubaubj)r'}r((h.Xìexamples
├── data
│   ├── meth_dummy.tsv
│   ├── mir_dummy.tsv
│   ├── rna_dummy.tsv
│   ├── rna_test_dummy.tsv
│   ├── survival_dummy.tsv
│   └── survival_test_dummy.tsv
```

- And a real dataset in the `data` folder. This dataset derives from the TCGA HCC cancer dataset and needs to be decompressed before processing:

```bash
data
├── meth.tsv.gz
├── mir.tsv.gz
├── rna.tsv.gz
└── survival.tsv
```
## Creating a simple DeepProg model with one autoencoder for each omic

First, we will build a model using the example dataset from `./examples/data/` (these example files are set as the defaults in the config.py file). We will use them to show how to construct a single DeepProg model inferring an autoencoder for each omic.

```python
# The SimDeep class can be used to build one model with one autoencoder for each omic
from simdeep.simdeep_analysis import SimDeep
from simdeep.extract_data import LoadData

help(SimDeep) # to see all the functions
help(LoadData) # to see all the functions related to loading datasets

# Defining training datasets
from simdeep.config import TRAINING_TSV
from simdeep.config import SURVIVAL_TSV

dataset = LoadData(training_tsv=TRAINING_TSV, survival_tsv=SURVIVAL_TSV)

simDeep = SimDeep(dataset=dataset) # instantiate the model with the dummy example training dataset defined in the config file
simDeep.load_training_dataset() # load the training dataset
simDeep.fit() # fit the model

# Defining test datasets
from simdeep.config import TEST_TSV
from simdeep.config import SURVIVAL_TSV_TEST

simDeep.load_new_test_dataset(TEST_TSV, SURVIVAL_TSV_TEST, fname_key='dummy')

# The test set is a dummy rna expression (generated randomly)
print(simDeep.dataset.test_tsv) # Defined in the config file
# The data type of the test set is also defined to match an existing type
print(simDeep.dataset.data_type) # Defined in the config file
simDeep.predict_labels_on_test_dataset() # Perform the classification analysis and label the test dataset

print(simDeep.test_labels)
print(simDeep.test_labels_proba)

simDeep.save_encoder('dummy_encoder.h5')
```

## Creating a DeepProg model using an ensemble of submodels

Secondly, we will build a more complex DeepProg model, constituted of an ensemble of sub-models, each originating from a subset of the data. For that purpose, we need to use the `SimDeepBoosting` class:

```python
from simdeep.simdeep_boosting import SimDeepBoosting

help(SimDeepBoosting)

from collections import OrderedDict


path_data = "../examples/data/"
# Example tsv files
tsv_files = OrderedDict([
          ('MIR', 'mir_dummy.tsv'),
          ('METH', 'meth_dummy.tsv'),
          ('RNA', 'rna_dummy.tsv'),
])

# File with survival event
survival_tsv = 'survival_dummy.tsv'

project_name = 'stacked_TestProject'
epochs = 10 # Autoencoder epochs. Other hyperparameters can be fine-tuned. See the example files
seed = 3 # random seed used for reproducibility
nb_it = 5 # This is the number of models to be fitted using only a subset of the training data
nb_threads = 2 # Number of threads to be used to compute the survival functions

boosting = SimDeepBoosting(
    nb_threads=nb_threads,
    nb_it=nb_it,
    split_n_fold=3,
    training_tsv=tsv_files,
    survival_tsv=survival_tsv,
    path_data=path_data,
    project_name=project_name,
    path_results=path_data,
    epochs=epochs,
    seed=seed)

# Fit the model
boosting.fit()
# Predict and write the labels
boosting.predict_labels_on_full_dataset()
# Compute internal metrics
boosting.compute_clusters_consistency_for_full_labels()
# Compute the feature importance
boosting.compute_feature_scores_per_cluster()
# Write the feature importance
boosting.write_feature_score_per_cluster()

boosting.load_new_test_dataset(
    {'RNA': 'rna_dummy.tsv'}, # OMIC file of the test set. It doesn't have to be the same as for the training
    'survival_dummy.tsv', # Survival file of the test set
    'TEST_DATA_1', # Name of the test dataset to be used
)

# Predict the labels on the test dataset
boosting.predict_labels_on_test_dataset()
# Compute C-index
boosting.compute_c_indexes_for_test_dataset()
# See cluster consistency
boosting.compute_clusters_consistency_for_test_labels()

# [EXPERIMENTAL] method to plot the test dataset amongst the class kernel densities
boosting.plot_supervised_kernel_for_test_sets()
```
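The same fitted ensemble can then be reused to score additional test datasets by repeating the calls above. A short usage sketch (the 'METH' omic file and the 'TEST_DATA_2' name are hypothetical placeholders):

```python
# Score a second test dataset with the already fitted ensemble
boosting.load_new_test_dataset(
    {'METH': 'meth_dummy.tsv'},  # hypothetical second test omic file
    'survival_dummy.tsv',        # survival file of the test set
    'TEST_DATA_2',               # hypothetical name for this test dataset
)

boosting.predict_labels_on_test_dataset()
boosting.compute_c_indexes_for_test_dataset()
boosting.compute_clusters_consistency_for_test_labels()
```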
## Creating a distributed DeepProg model using an ensemble of submodels

We can allow DeepProg to distribute the creation of each submodel on different clusters/nodes/CPUs by using the ray framework.
The configuration of the nodes / clusters, or of the local CPUs to be used, needs to be done when instantiating a new ray object with the ray [API](https://ray.readthedocs.io/en/latest/). It is however quite straightforward to define the number of instances launched on a local machine, such as in the example below, in which 3 instances are used.

```python
# Instantiate a ray object that will create multiple workers
import ray
ray.init(num_cpus=3)
# More options can be used (e.g. remote clusters, AWS, memory,...etc...)
# ray can be used locally to maximize the use of CPUs on the local machine
# See ray API: https://ray.readthedocs.io/en/latest/index.html

boosting = SimDeepBoosting(
    ...
    distribute=True, # Additional option to use ray cluster scheduler
    ...
)
...
# Processing
...

# Close clusters and free memory
ray.shutdown()
```

## Example scripts

Example scripts are available in `./examples/` and will assist you in building a model from scratch with test and real data:

```bash
examples
├── create_autoencoder_from_scratch.py # Construct a simple deeprog model on the dummy example dataset
├── example_with_dummy_data_distributed.py # Process the dummy example dataset using ray
├── example_with_dummy_data.py # Process the dummy example dataset
└── load_3_omics_model.py # Process the example HCC dataset

```

## contact and credentials

- Developer: Olivier Poirion (PhD)
- contact: opoirion@hawaii.edu, o.poirion@gmail.com