=========================================================================================================
Layer (type:depth-idx) Input Shape Output Shape
=========================================================================================================
Unet -- --
├─EfficientNetEncoder-B5: 1-1 [1, 3, 512, 512] [1, 3, 512, 512]
│ └─Conv2d: 2-2 [1, 3, 512, 512] [1, 48, 256, 256]
│ └─BatchNorm2d: 2-3 [1, 48, 256, 256] [1, 48, 256, 256]
│ └─Swish: 2-4 [1, 48, 256, 256] [1, 48, 256, 256]
│ └─Sequential: 2 -- --
│ │ └─Sequential: 3-1 [1, 48, 256, 256] [1, 24, 256, 256]
│ │ │ └─DepthwiseSeparableConv: 4-1 -> 4-3 [1, 48, 256, 256] [1, 24, 256, 256]
│ │ └─Sequential: 3-2 [1, 24, 256, 256] [1, 40, 128, 128]
│ │ │ └─InvertedResidual: 4-4 -> 4-8 [1, 24, 256, 256] [1, 40, 128, 128]
│ │ └─Sequential: 3-3 [1, 40, 128, 128] [1, 64, 64, 64]
│ │ │ └─InvertedResidual: 4-9 -> 4-13 [1, 40, 128, 128] [1, 64, 64, 64]
│ │ └─Sequential: 3-4 [1, 64, 64, 64] [1, 128, 32, 32]
│ │ │ └─InvertedResidual: 4-14 -> 4-20 [1, 64, 64, 64] [1, 128, 32, 32]
│ │ └─Sequential: 3-5 [1, 128, 32, 32] [1, 176, 32, 32]
│ │ │ └─InvertedResidual: 4-21 -> 4-27 [1, 128, 32, 32] [1, 176, 32, 32]
│ │ └─Sequential: 3-6 [1, 176, 32, 32] [1, 304, 16, 16]
│ │ │ └─InvertedResidual: 4-28 -> 4-36 [1, 176, 32, 32] [1, 304, 16, 16]
│ │ └─Sequential: 3-7 [1, 304, 16, 16] [1, 512, 16, 16]
│ │ │ └─InvertedResidual: 4-37 -> 4-39 [1, 304, 16, 16] [1, 512, 16, 16]
├─UnetDecoder: 1-2 [1, 3, 512, 512] [1, 32, 512, 512]
│ └─Identity: 2-5 [1, 512, 16, 16] [1, 512, 16, 16]
│ └─ModuleList: 2-1 -- --
│ │ └─DecoderBlock: 3-8 [1, 512, 16, 16] [1, 512, 32, 32]
│ │ └─DecoderBlock: 3-9 [1, 512, 32, 32] [1, 256, 64, 64]
│ │ └─DecoderBlock: 3-10 [1, 256, 64, 64] [1, 128, 128, 128]
│ │ └─DecoderBlock: 3-11 [1, 128, 128, 128] [1, 64, 256, 256]
│ │ └─DecoderBlock: 3-12 [1, 64, 256, 256] [1, 32, 512, 512]
├─SegmentationHead: 1-3 [1, 32, 512, 512] [1, 1, 512, 512]
│ └─Conv2d: 2-6 [1, 32, 512, 512] [1, 1, 512, 512]
│ └─Identity: 2-7 [1, 1, 512, 512] [1, 1, 512, 512]
│ └─Activation: 2-8 [1, 1, 512, 512] [1, 1, 512, 512]
│ │ └─Identity: 3-13 [1, 1, 512, 512] [1, 1, 512, 512]
=========================================================================================================
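
The layer names above (EfficientNetEncoder, UnetDecoder, DecoderBlock, SegmentationHead) match the
segmentation_models_pytorch Unet, and the encoder block names (DepthwiseSeparableConv, InvertedResidual,
SqueezeExcite, Swish) point to a timm EfficientNet-B5 encoder. Below is a minimal sketch of how a summary
like the one above could be reproduced with torchinfo; encoder_name, decoder_channels and the other
arguments are assumptions inferred from the shapes in the table, not read from this file:

    import segmentation_models_pytorch as smp
    from torchinfo import summary

    # Assumption: smp.Unet with a timm EfficientNet-B5 encoder.
    # decoder_channels are inferred from the DecoderBlock output widths (512 ... 32);
    # classes=1 and activation=None follow from the SegmentationHead rows above.
    model = smp.Unet(
        encoder_name="timm-efficientnet-b5",
        encoder_weights=None,
        decoder_channels=(512, 256, 128, 64, 32),
        decoder_attention_type=None,   # Attention wraps Identity in the summary
        in_channels=3,
        classes=1,
        activation=None,
    )

    summary(
        model,
        input_size=(1, 3, 512, 512),
        col_names=("input_size", "output_size"),
        depth=4,
    )
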
====================================================================================================
Layer (type:depth-idx) Abstracted Input Shape Abstracted Output Shape
====================================================================================================
DepthwiseSeparableConv: 1-1 [1, C, H, W] [1, C, H, W]
└─Conv2d: 2-1 [1, C, H, W] [1, C, H, W]
└─BatchNorm2d: 2-2 [1, C, H, W] [1, C, H, W]
└─Swish: 2-3 [1, C, H, W] [1, C, H, W]
└─SqueezeExcite: 2-4 [1, C, H, W] [1, C, H, W]
└─Conv2d: 2-5 [1, C, H, W] [1, C, H, W]
└─BatchNorm2d: 2-6 [1, C, H, W] [1, C, H, W]
└─Identity: 2-7 [1, C, H, W] [1, C, H, W]
InvertedResidual: 1-1 [1, C2, H2, W2] [1, C2, H2, W2]
└─Conv2d: 2-1 [1, C2, H2, W2] [1, C2 x 6, H2, W2]
└─BatchNorm2d: 2-2 [1, C2 x 6, H2, W2] [1, C2 x 6, H2, W2]
└─Swish: 2-3 [1, C2 x 6, H2, W2] [1, C2 x 6, H2, W2]
└─Conv2d: 2-4 [1, C2 x 6, H2, W2] [1, C2 x 6, H2, W2]
└─BatchNorm2d: 2-5 [1, C2 x 6, H2, W2] [1, C2 x 6, H2, W2]
└─Swish: 2-6 [1, C2 x 6, H2, W2] [1, C2 x 6, H2, W2]
└─SqueezeExcite: 2-7 [1, C2 x 6, H2, W2] [1, C2 x 6, H2, W2]
└─Conv2d: 2-8 [1, C2 x 6, H2, W2] [1, C2, H2, W2]
└─BatchNorm2d: 2-9 [1, C2, H2, W2] [1, C2, H2, W2]
DecoderBlock: 1-1 [1, C3 x 8, H3, W3] [1, C3 x 4, H3 x 2, W3 x 2]
└─Attention: 2-1 [1, C3 x 9, H3 x 2, W3 x 2] [1, C3 x 9, H3 x 2, W3 x 2]
│ └─Identity: 3-1 [1, C3 x 9, H3 x 2, W3 x 2] [1, C3 x 9, H3 x 2, W3 x 2]
└─Conv2dReLU: 2-2 [1, C3 x 9, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
│ └─Conv2d: 3-1 [1, C3 x 9, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
│ └─BatchNorm2d: 3-2 [1, C3 x 4, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
│ └─ReLU: 3-3 [1, C3 x 4, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
└─Conv2dReLU: 2-3 [1, C3 x 4, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
│ └─Conv2d: 3-1 [1, C3 x 4, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
│ └─BatchNorm2d: 3-2 [1, C3 x 4, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
│ └─ReLU: 3-3 [1, C3 x 4, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
└─Attention: 2-4 [1, C3 x 4, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
│ └─Identity: 3-1 [1, C3 x 4, H3 x 2, W3 x 2] [1, C3 x 4, H3 x 2, W3 x 2]
====================================================================================================
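
For reference, an illustrative sketch of the abstracted DecoderBlock above (not the library code):
upsample by 2, concatenate the encoder skip (C3 channels, giving C3 x 9), then two Conv2dReLU blocks
that reduce to C3 x 4. The Attention stages are omitted here because they wrap Identity in the summary.
The concrete channel counts in the shape check are assumptions chosen so that C3 = 16 matches the
abstracted shapes:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Conv2dReLU(nn.Sequential):
        # Conv -> BatchNorm -> ReLU, as in the Conv2dReLU rows above.
        def __init__(self, in_ch, out_ch):
            super().__init__(
                nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )

    class DecoderBlock(nn.Module):
        def __init__(self, in_ch, skip_ch, out_ch):
            super().__init__()
            self.conv1 = Conv2dReLU(in_ch + skip_ch, out_ch)
            self.conv2 = Conv2dReLU(out_ch, out_ch)

        def forward(self, x, skip=None):
            # [1, C3*8, H3, W3] -> [1, C3*8, 2*H3, 2*W3]
            x = F.interpolate(x, scale_factor=2, mode="nearest")
            if skip is not None:
                # C3*8 + C3 = C3*9 channels after concatenation
                x = torch.cat([x, skip], dim=1)
            x = self.conv1(x)          # C3*9 -> C3*4
            return self.conv2(x)       # C3*4 -> C3*4

    # Shape check with C3 = 16 (all values here are illustrative assumptions):
    # input [1, 128, 32, 32] + skip [1, 16, 64, 64] -> output [1, 64, 64, 64]
    block = DecoderBlock(in_ch=128, skip_ch=16, out_ch=64)
    out = block(torch.randn(1, 128, 32, 32), torch.randn(1, 16, 64, 64))
    print(out.shape)   # torch.Size([1, 64, 64, 64])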