tomaarsen committed
Commit 3ac17d1
1 Parent(s): 7702ced

Add new SentenceTransformer model with an OpenVINO backend
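This repository is meant to be loaded through the `backend` argument that sentence-transformers introduced in v3.2 (matching the version bump in config_sentence_transformers.json below). A minimal sketch, not part of the commit, assuming a `sentence-transformers[openvino]` install:

    # Sketch only: load this repo with the OpenVINO backend so that
    # openvino/openvino_model.xml is used for inference instead of PyTorch.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer(
        "sentence-transformers-testing/stsb-bert-tiny-openvino",
        backend="openvino",
    )
    embeddings = model.encode(["An example sentence", "Another one"])
    print(embeddings.shape)  # (2, 128) for this tiny BERT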
config.json CHANGED
@@ -1,5 +1,5 @@
 {
-  "_name_or_path": "sentence-transformers-testing/stsb-bert-tiny-safetensors",
+  "_name_or_path": "sentence-transformers-testing/stsb-bert-tiny-openvino",
   "architectures": [
     "BertModel"
   ],
config_sentence_transformers.json CHANGED
@@ -1,6 +1,6 @@
 {
   "__version__": {
-    "sentence_transformers": "3.1.0.dev0",
+    "sentence_transformers": "3.2.0.dev0",
     "transformers": "4.43.4",
     "pytorch": "2.5.0.dev20240807+cu121"
   },
openvino/openvino_model.bin ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ab5151e5f9953fc0fd7a6e43c42d6169926230fa47063bc88a52eac22650cb8e
+size 17481872
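The .bin file is stored as a Git LFS pointer: the repository tracks only the oid (a SHA-256 of the blob) and its byte size, while the ~17 MB of weights live in LFS storage. After a download, the blob can be checked against the pointer, for example:

    # Illustrative check, not part of the commit: verify a downloaded blob
    # against the oid/size recorded in the LFS pointer above.
    import hashlib
    from pathlib import Path

    blob = Path("openvino/openvino_model.bin").read_bytes()
    assert len(blob) == 17481872, "size mismatch with LFS pointer"
    assert hashlib.sha256(blob).hexdigest() == (
        "ab5151e5f9953fc0fd7a6e43c42d6169926230fa47063bc88a52eac22650cb8e"
    ), "sha256 mismatch with LFS pointer"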
openvino/openvino_model.xml ADDED
@@ -0,0 +1,2845 @@
1
+ <?xml version="1.0"?>
2
+ <net name="Model0" version="11">
3
+ <layers>
4
+ <layer id="2" name="input_ids" type="Parameter" version="opset1">
5
+ <data shape="?,?" element_type="i64" />
6
+ <output>
7
+ <port id="0" precision="I64" names="input_ids">
8
+ <dim>-1</dim>
9
+ <dim>-1</dim>
10
+ </port>
11
+ </output>
12
+ </layer>
13
+ <layer id="1" name="attention_mask" type="Parameter" version="opset1">
14
+ <data shape="?,?" element_type="i64" />
15
+ <output>
16
+ <port id="0" precision="I64" names="attention_mask">
17
+ <dim>-1</dim>
18
+ <dim>-1</dim>
19
+ </port>
20
+ </output>
21
+ </layer>
22
+ <layer id="0" name="token_type_ids" type="Parameter" version="opset1">
23
+ <data shape="?,?" element_type="i64" />
24
+ <output>
25
+ <port id="0" precision="I64" names="token_type_ids">
26
+ <dim>-1</dim>
27
+ <dim>-1</dim>
28
+ </port>
29
+ </output>
30
+ </layer>
31
+ <layer id="3" name="self.embeddings.word_embeddings.weight" type="Const" version="opset1">
32
+ <data element_type="f32" shape="30522, 128" offset="0" size="15627264" />
33
+ <output>
34
+ <port id="0" precision="FP32" names="self.embeddings.word_embeddings.weight">
35
+ <dim>30522</dim>
36
+ <dim>128</dim>
37
+ </port>
38
+ </output>
39
+ </layer>
40
+ <layer id="4" name="__module.embeddings.word_embeddings/aten::embedding/Convert" type="Convert" version="opset1">
41
+ <data destination_type="i32" />
42
+ <input>
43
+ <port id="0" precision="I64">
44
+ <dim>-1</dim>
45
+ <dim>-1</dim>
46
+ </port>
47
+ </input>
48
+ <output>
49
+ <port id="1" precision="I32">
50
+ <dim>-1</dim>
51
+ <dim>-1</dim>
52
+ </port>
53
+ </output>
54
+ </layer>
55
+ <layer id="5" name="__module.embeddings.word_embeddings/aten::embedding/Constant" type="Const" version="opset1">
56
+ <data element_type="i32" shape="" offset="15627264" size="4" />
57
+ <output>
58
+ <port id="0" precision="I32" />
59
+ </output>
60
+ </layer>
61
+ <layer id="6" name="__module.embeddings.word_embeddings/aten::embedding/Gather" type="Gather" version="opset8">
62
+ <data batch_dims="0" />
63
+ <input>
64
+ <port id="0" precision="FP32">
65
+ <dim>30522</dim>
66
+ <dim>128</dim>
67
+ </port>
68
+ <port id="1" precision="I32">
69
+ <dim>-1</dim>
70
+ <dim>-1</dim>
71
+ </port>
72
+ <port id="2" precision="I32" />
73
+ </input>
74
+ <output>
75
+ <port id="3" precision="FP32" names="79,inputs_embeds">
76
+ <dim>-1</dim>
77
+ <dim>-1</dim>
78
+ <dim>128</dim>
79
+ </port>
80
+ </output>
81
+ </layer>
82
+ <layer id="7" name="self.embeddings.token_type_embeddings.weight" type="Const" version="opset1">
83
+ <data element_type="f32" shape="2, 128" offset="15627268" size="1024" />
84
+ <output>
85
+ <port id="0" precision="FP32" names="self.embeddings.token_type_embeddings.weight">
86
+ <dim>2</dim>
87
+ <dim>128</dim>
88
+ </port>
89
+ </output>
90
+ </layer>
91
+ <layer id="8" name="__module.embeddings.token_type_embeddings/aten::embedding/Convert" type="Convert" version="opset1">
92
+ <data destination_type="i32" />
93
+ <input>
94
+ <port id="0" precision="I64">
95
+ <dim>-1</dim>
96
+ <dim>-1</dim>
97
+ </port>
98
+ </input>
99
+ <output>
100
+ <port id="1" precision="I32">
101
+ <dim>-1</dim>
102
+ <dim>-1</dim>
103
+ </port>
104
+ </output>
105
+ </layer>
106
+ <layer id="9" name="__module.embeddings.token_type_embeddings/aten::embedding/Constant" type="Const" version="opset1">
107
+ <data element_type="i32" shape="" offset="15627264" size="4" />
108
+ <output>
109
+ <port id="0" precision="I32" />
110
+ </output>
111
+ </layer>
112
+ <layer id="10" name="__module.embeddings.token_type_embeddings/aten::embedding/Gather" type="Gather" version="opset8">
113
+ <data batch_dims="0" />
114
+ <input>
115
+ <port id="0" precision="FP32">
116
+ <dim>2</dim>
117
+ <dim>128</dim>
118
+ </port>
119
+ <port id="1" precision="I32">
120
+ <dim>-1</dim>
121
+ <dim>-1</dim>
122
+ </port>
123
+ <port id="2" precision="I32" />
124
+ </input>
125
+ <output>
126
+ <port id="3" precision="FP32" names="81,token_type_embeddings.1">
127
+ <dim>-1</dim>
128
+ <dim>-1</dim>
129
+ <dim>128</dim>
130
+ </port>
131
+ </output>
132
+ </layer>
133
+ <layer id="11" name="__module.embeddings/aten::add/Add" type="Add" version="opset1">
134
+ <data auto_broadcast="numpy" />
135
+ <input>
136
+ <port id="0" precision="FP32">
137
+ <dim>-1</dim>
138
+ <dim>-1</dim>
139
+ <dim>128</dim>
140
+ </port>
141
+ <port id="1" precision="FP32">
142
+ <dim>-1</dim>
143
+ <dim>-1</dim>
144
+ <dim>128</dim>
145
+ </port>
146
+ </input>
147
+ <output>
148
+ <port id="2" precision="FP32" names="82_1">
149
+ <dim>-1</dim>
150
+ <dim>-1</dim>
151
+ <dim>128</dim>
152
+ </port>
153
+ </output>
154
+ </layer>
155
+ <layer id="12" name="self.embeddings.position_embeddings.weight" type="Const" version="opset1">
156
+ <data element_type="f32" shape="512, 128" offset="15628292" size="262144" />
157
+ <output>
158
+ <port id="0" precision="FP32" names="self.embeddings.position_embeddings.weight">
159
+ <dim>512</dim>
160
+ <dim>128</dim>
161
+ </port>
162
+ </output>
163
+ </layer>
164
+ <layer id="13" name="__module.embeddings/aten::slice/Slice" type="Const" version="opset1">
165
+ <data element_type="i64" shape="1, 512" offset="15890436" size="4096" />
166
+ <output>
167
+ <port id="0" precision="I64" names="76">
168
+ <dim>1</dim>
169
+ <dim>512</dim>
170
+ </port>
171
+ </output>
172
+ </layer>
173
+ <layer id="14" name="__module.embeddings/aten::slice/Reshape" type="Const" version="opset1">
174
+ <data element_type="i64" shape="1" offset="15894532" size="8" />
175
+ <output>
176
+ <port id="0" precision="I64">
177
+ <dim>1</dim>
178
+ </port>
179
+ </output>
180
+ </layer>
181
+ <layer id="15" name="ShapeOf_4117" type="ShapeOf" version="opset3">
182
+ <data output_type="i64" />
183
+ <input>
184
+ <port id="0" precision="I64">
185
+ <dim>-1</dim>
186
+ <dim>-1</dim>
187
+ </port>
188
+ </input>
189
+ <output>
190
+ <port id="1" precision="I64">
191
+ <dim>2</dim>
192
+ </port>
193
+ </output>
194
+ </layer>
195
+ <layer id="16" name="Constant_4238" type="Const" version="opset1">
196
+ <data element_type="i64" shape="1" offset="15894540" size="8" />
197
+ <output>
198
+ <port id="0" precision="I64">
199
+ <dim>1</dim>
200
+ </port>
201
+ </output>
202
+ </layer>
203
+ <layer id="17" name="Constant_4119" type="Const" version="opset1">
204
+ <data element_type="i64" shape="" offset="15894532" size="8" />
205
+ <output>
206
+ <port id="0" precision="I64" />
207
+ </output>
208
+ </layer>
209
+ <layer id="18" name="Gather_4120" type="Gather" version="opset8">
210
+ <data batch_dims="0" />
211
+ <input>
212
+ <port id="0" precision="I64">
213
+ <dim>2</dim>
214
+ </port>
215
+ <port id="1" precision="I64">
216
+ <dim>1</dim>
217
+ </port>
218
+ <port id="2" precision="I64" />
219
+ </input>
220
+ <output>
221
+ <port id="3" precision="I64" names="10,72,74,75,8">
222
+ <dim>1</dim>
223
+ </port>
224
+ </output>
225
+ </layer>
226
+ <layer id="19" name="__module.embeddings/aten::slice/Reshape_2" type="Const" version="opset1">
227
+ <data element_type="i64" shape="1" offset="15894540" size="8" />
228
+ <output>
229
+ <port id="0" precision="I64">
230
+ <dim>1</dim>
231
+ </port>
232
+ </output>
233
+ </layer>
234
+ <layer id="20" name="__module.embeddings/aten::slice/Reshape_3" type="Const" version="opset1">
235
+ <data element_type="i64" shape="1" offset="15894540" size="8" />
236
+ <output>
237
+ <port id="0" precision="I64">
238
+ <dim>1</dim>
239
+ </port>
240
+ </output>
241
+ </layer>
242
+ <layer id="21" name="__module.embeddings/aten::slice/Slice_1" type="Slice" version="opset8">
243
+ <input>
244
+ <port id="0" precision="I64">
245
+ <dim>1</dim>
246
+ <dim>512</dim>
247
+ </port>
248
+ <port id="1" precision="I64">
249
+ <dim>1</dim>
250
+ </port>
251
+ <port id="2" precision="I64">
252
+ <dim>1</dim>
253
+ </port>
254
+ <port id="3" precision="I64">
255
+ <dim>1</dim>
256
+ </port>
257
+ <port id="4" precision="I64">
258
+ <dim>1</dim>
259
+ </port>
260
+ </input>
261
+ <output>
262
+ <port id="5" precision="I64" names="77">
263
+ <dim>1</dim>
264
+ <dim>-1</dim>
265
+ </port>
266
+ </output>
267
+ </layer>
268
+ <layer id="22" name="__module.embeddings.position_embeddings/aten::embedding/Convert" type="Convert" version="opset1">
269
+ <data destination_type="i32" />
270
+ <input>
271
+ <port id="0" precision="I64">
272
+ <dim>1</dim>
273
+ <dim>-1</dim>
274
+ </port>
275
+ </input>
276
+ <output>
277
+ <port id="1" precision="I32">
278
+ <dim>1</dim>
279
+ <dim>-1</dim>
280
+ </port>
281
+ </output>
282
+ </layer>
283
+ <layer id="23" name="__module.embeddings.position_embeddings/aten::embedding/Constant" type="Const" version="opset1">
284
+ <data element_type="i32" shape="" offset="15627264" size="4" />
285
+ <output>
286
+ <port id="0" precision="I32" />
287
+ </output>
288
+ </layer>
289
+ <layer id="24" name="__module.embeddings.position_embeddings/aten::embedding/Gather" type="Gather" version="opset8">
290
+ <data batch_dims="0" />
291
+ <input>
292
+ <port id="0" precision="FP32">
293
+ <dim>512</dim>
294
+ <dim>128</dim>
295
+ </port>
296
+ <port id="1" precision="I32">
297
+ <dim>1</dim>
298
+ <dim>-1</dim>
299
+ </port>
300
+ <port id="2" precision="I32" />
301
+ </input>
302
+ <output>
303
+ <port id="3" precision="FP32" names="84,position_embeddings.1">
304
+ <dim>1</dim>
305
+ <dim>-1</dim>
306
+ <dim>128</dim>
307
+ </port>
308
+ </output>
309
+ </layer>
310
+ <layer id="25" name="__module.embeddings/aten::add_/Add" type="Add" version="opset1">
311
+ <data auto_broadcast="numpy" />
312
+ <input>
313
+ <port id="0" precision="FP32">
314
+ <dim>-1</dim>
315
+ <dim>-1</dim>
316
+ <dim>128</dim>
317
+ </port>
318
+ <port id="1" precision="FP32">
319
+ <dim>1</dim>
320
+ <dim>-1</dim>
321
+ <dim>128</dim>
322
+ </port>
323
+ </input>
324
+ <output>
325
+ <port id="2" precision="FP32" names="82,embeddings.1">
326
+ <dim>-1</dim>
327
+ <dim>-1</dim>
328
+ <dim>128</dim>
329
+ </port>
330
+ </output>
331
+ </layer>
332
+ <layer id="26" name="__module.embeddings.LayerNorm/aten::layer_norm/Multiply" type="Const" version="opset1">
333
+ <data element_type="i32" shape="1" offset="15894548" size="4" />
334
+ <output>
335
+ <port id="0" precision="I32">
336
+ <dim>1</dim>
337
+ </port>
338
+ </output>
339
+ </layer>
340
+ <layer id="27" name="__module.embeddings.LayerNorm/aten::layer_norm/MVN" type="MVN" version="opset6">
341
+ <data eps="9.999999960041972e-13" normalize_variance="true" eps_mode="INSIDE_SQRT" />
342
+ <input>
343
+ <port id="0" precision="FP32">
344
+ <dim>-1</dim>
345
+ <dim>-1</dim>
346
+ <dim>128</dim>
347
+ </port>
348
+ <port id="1" precision="I32">
349
+ <dim>1</dim>
350
+ </port>
351
+ </input>
352
+ <output>
353
+ <port id="2" precision="FP32">
354
+ <dim>-1</dim>
355
+ <dim>-1</dim>
356
+ <dim>128</dim>
357
+ </port>
358
+ </output>
359
+ </layer>
360
+ <layer id="28" name="Constant_4060" type="Const" version="opset1">
361
+ <data element_type="f32" shape="1, 1, 128" offset="15894552" size="512" />
362
+ <output>
363
+ <port id="0" precision="FP32">
364
+ <dim>1</dim>
365
+ <dim>1</dim>
366
+ <dim>128</dim>
367
+ </port>
368
+ </output>
369
+ </layer>
370
+ <layer id="29" name="__module.embeddings.LayerNorm/aten::layer_norm/Multiply_1" type="Multiply" version="opset1">
371
+ <data auto_broadcast="numpy" />
372
+ <input>
373
+ <port id="0" precision="FP32">
374
+ <dim>-1</dim>
375
+ <dim>-1</dim>
376
+ <dim>128</dim>
377
+ </port>
378
+ <port id="1" precision="FP32">
379
+ <dim>1</dim>
380
+ <dim>1</dim>
381
+ <dim>128</dim>
382
+ </port>
383
+ </input>
384
+ <output>
385
+ <port id="2" precision="FP32">
386
+ <dim>-1</dim>
387
+ <dim>-1</dim>
388
+ <dim>128</dim>
389
+ </port>
390
+ </output>
391
+ </layer>
392
+ <layer id="30" name="Constant_4061" type="Const" version="opset1">
393
+ <data element_type="f32" shape="1, 1, 128" offset="15895064" size="512" />
394
+ <output>
395
+ <port id="0" precision="FP32">
396
+ <dim>1</dim>
397
+ <dim>1</dim>
398
+ <dim>128</dim>
399
+ </port>
400
+ </output>
401
+ </layer>
402
+ <layer id="31" name="__module.embeddings.LayerNorm/aten::layer_norm/Add" type="Add" version="opset1">
403
+ <data auto_broadcast="numpy" />
404
+ <input>
405
+ <port id="0" precision="FP32">
406
+ <dim>-1</dim>
407
+ <dim>-1</dim>
408
+ <dim>128</dim>
409
+ </port>
410
+ <port id="1" precision="FP32">
411
+ <dim>1</dim>
412
+ <dim>1</dim>
413
+ <dim>128</dim>
414
+ </port>
415
+ </input>
416
+ <output>
417
+ <port id="2" precision="FP32" names="89,input.1">
418
+ <dim>-1</dim>
419
+ <dim>-1</dim>
420
+ <dim>128</dim>
421
+ </port>
422
+ </output>
423
+ </layer>
424
+ <layer id="32" name="self.encoder.layer.0.attention.self.query.weight" type="Const" version="opset1">
425
+ <data element_type="f32" shape="128, 128" offset="15895576" size="65536" />
426
+ <output>
427
+ <port id="0" precision="FP32" names="self.encoder.layer.0.attention.self.query.weight">
428
+ <dim>128</dim>
429
+ <dim>128</dim>
430
+ </port>
431
+ </output>
432
+ </layer>
433
+ <layer id="33" name="__module.encoder.layer.0.attention.self.query/aten::linear/MatMul" type="MatMul" version="opset1">
434
+ <data transpose_a="false" transpose_b="true" />
435
+ <input>
436
+ <port id="0" precision="FP32">
437
+ <dim>-1</dim>
438
+ <dim>-1</dim>
439
+ <dim>128</dim>
440
+ </port>
441
+ <port id="1" precision="FP32">
442
+ <dim>128</dim>
443
+ <dim>128</dim>
444
+ </port>
445
+ </input>
446
+ <output>
447
+ <port id="2" precision="FP32">
448
+ <dim>-1</dim>
449
+ <dim>-1</dim>
450
+ <dim>128</dim>
451
+ </port>
452
+ </output>
453
+ </layer>
454
+ <layer id="34" name="Constant_4062" type="Const" version="opset1">
455
+ <data element_type="f32" shape="1, 1, 128" offset="15961112" size="512" />
456
+ <output>
457
+ <port id="0" precision="FP32">
458
+ <dim>1</dim>
459
+ <dim>1</dim>
460
+ <dim>128</dim>
461
+ </port>
462
+ </output>
463
+ </layer>
464
+ <layer id="35" name="__module.encoder.layer.0.attention.self.query/aten::linear/Add" type="Add" version="opset1">
465
+ <data auto_broadcast="numpy" />
466
+ <input>
467
+ <port id="0" precision="FP32">
468
+ <dim>-1</dim>
469
+ <dim>-1</dim>
470
+ <dim>128</dim>
471
+ </port>
472
+ <port id="1" precision="FP32">
473
+ <dim>1</dim>
474
+ <dim>1</dim>
475
+ <dim>128</dim>
476
+ </port>
477
+ </input>
478
+ <output>
479
+ <port id="2" precision="FP32" names="120,x.1">
480
+ <dim>-1</dim>
481
+ <dim>-1</dim>
482
+ <dim>128</dim>
483
+ </port>
484
+ </output>
485
+ </layer>
486
+ <layer id="36" name="__module.encoder.layer.0.attention.self/prim::ListConstruct/Concat" type="Const" version="opset1">
487
+ <data element_type="i64" shape="4" offset="15961624" size="32" />
488
+ <output>
489
+ <port id="0" precision="I64">
490
+ <dim>4</dim>
491
+ </port>
492
+ </output>
493
+ </layer>
494
+ <layer id="37" name="__module.encoder.layer.0.attention.self/aten::view/Reshape" type="Reshape" version="opset1">
495
+ <data special_zero="true" />
496
+ <input>
497
+ <port id="0" precision="FP32">
498
+ <dim>-1</dim>
499
+ <dim>-1</dim>
500
+ <dim>128</dim>
501
+ </port>
502
+ <port id="1" precision="I64">
503
+ <dim>4</dim>
504
+ </port>
505
+ </input>
506
+ <output>
507
+ <port id="2" precision="FP32" names="124,x.3">
508
+ <dim>-1</dim>
509
+ <dim>-1</dim>
510
+ <dim>2</dim>
511
+ <dim>64</dim>
512
+ </port>
513
+ </output>
514
+ </layer>
515
+ <layer id="38" name="Constant_238" type="Const" version="opset1">
516
+ <data element_type="i64" shape="4" offset="15961656" size="32" />
517
+ <output>
518
+ <port id="0" precision="I64" names="125">
519
+ <dim>4</dim>
520
+ </port>
521
+ </output>
522
+ </layer>
523
+ <layer id="39" name="__module.encoder.layer.0.attention.self/aten::permute/Transpose" type="Transpose" version="opset1">
524
+ <input>
525
+ <port id="0" precision="FP32">
526
+ <dim>-1</dim>
527
+ <dim>-1</dim>
528
+ <dim>2</dim>
529
+ <dim>64</dim>
530
+ </port>
531
+ <port id="1" precision="I64">
532
+ <dim>4</dim>
533
+ </port>
534
+ </input>
535
+ <output>
536
+ <port id="2" precision="FP32" names="126">
537
+ <dim>-1</dim>
538
+ <dim>2</dim>
539
+ <dim>-1</dim>
540
+ <dim>64</dim>
541
+ </port>
542
+ </output>
543
+ </layer>
544
+ <layer id="40" name="self.encoder.layer.0.attention.self.key.weight" type="Const" version="opset1">
545
+ <data element_type="f32" shape="128, 128" offset="15961688" size="65536" />
546
+ <output>
547
+ <port id="0" precision="FP32" names="self.encoder.layer.0.attention.self.key.weight">
548
+ <dim>128</dim>
549
+ <dim>128</dim>
550
+ </port>
551
+ </output>
552
+ </layer>
553
+ <layer id="41" name="__module.encoder.layer.0.attention.self.key/aten::linear/MatMul" type="MatMul" version="opset1">
554
+ <data transpose_a="false" transpose_b="true" />
555
+ <input>
556
+ <port id="0" precision="FP32">
557
+ <dim>-1</dim>
558
+ <dim>-1</dim>
559
+ <dim>128</dim>
560
+ </port>
561
+ <port id="1" precision="FP32">
562
+ <dim>128</dim>
563
+ <dim>128</dim>
564
+ </port>
565
+ </input>
566
+ <output>
567
+ <port id="2" precision="FP32">
568
+ <dim>-1</dim>
569
+ <dim>-1</dim>
570
+ <dim>128</dim>
571
+ </port>
572
+ </output>
573
+ </layer>
574
+ <layer id="42" name="Constant_4063" type="Const" version="opset1">
575
+ <data element_type="f32" shape="1, 1, 128" offset="16027224" size="512" />
576
+ <output>
577
+ <port id="0" precision="FP32">
578
+ <dim>1</dim>
579
+ <dim>1</dim>
580
+ <dim>128</dim>
581
+ </port>
582
+ </output>
583
+ </layer>
584
+ <layer id="43" name="__module.encoder.layer.0.attention.self.key/aten::linear/Add" type="Add" version="opset1">
585
+ <data auto_broadcast="numpy" />
586
+ <input>
587
+ <port id="0" precision="FP32">
588
+ <dim>-1</dim>
589
+ <dim>-1</dim>
590
+ <dim>128</dim>
591
+ </port>
592
+ <port id="1" precision="FP32">
593
+ <dim>1</dim>
594
+ <dim>1</dim>
595
+ <dim>128</dim>
596
+ </port>
597
+ </input>
598
+ <output>
599
+ <port id="2" precision="FP32" names="129,x.5">
600
+ <dim>-1</dim>
601
+ <dim>-1</dim>
602
+ <dim>128</dim>
603
+ </port>
604
+ </output>
605
+ </layer>
606
+ <layer id="44" name="__module.encoder.layer.0.attention.self/prim::ListConstruct/Concat_1" type="Const" version="opset1">
607
+ <data element_type="i64" shape="4" offset="15961624" size="32" />
608
+ <output>
609
+ <port id="0" precision="I64">
610
+ <dim>4</dim>
611
+ </port>
612
+ </output>
613
+ </layer>
614
+ <layer id="45" name="__module.encoder.layer.0.attention.self/aten::view/Reshape_1" type="Reshape" version="opset1">
615
+ <data special_zero="true" />
616
+ <input>
617
+ <port id="0" precision="FP32">
618
+ <dim>-1</dim>
619
+ <dim>-1</dim>
620
+ <dim>128</dim>
621
+ </port>
622
+ <port id="1" precision="I64">
623
+ <dim>4</dim>
624
+ </port>
625
+ </input>
626
+ <output>
627
+ <port id="2" precision="FP32" names="133,x.7">
628
+ <dim>-1</dim>
629
+ <dim>-1</dim>
630
+ <dim>2</dim>
631
+ <dim>64</dim>
632
+ </port>
633
+ </output>
634
+ </layer>
635
+ <layer id="46" name="Constant_263" type="Const" version="opset1">
636
+ <data element_type="i64" shape="4" offset="15961656" size="32" />
637
+ <output>
638
+ <port id="0" precision="I64" names="134">
639
+ <dim>4</dim>
640
+ </port>
641
+ </output>
642
+ </layer>
643
+ <layer id="47" name="__module.encoder.layer.0.attention.self/aten::permute/Transpose_1" type="Transpose" version="opset1">
644
+ <input>
645
+ <port id="0" precision="FP32">
646
+ <dim>-1</dim>
647
+ <dim>-1</dim>
648
+ <dim>2</dim>
649
+ <dim>64</dim>
650
+ </port>
651
+ <port id="1" precision="I64">
652
+ <dim>4</dim>
653
+ </port>
654
+ </input>
655
+ <output>
656
+ <port id="2" precision="FP32" names="135">
657
+ <dim>-1</dim>
658
+ <dim>2</dim>
659
+ <dim>-1</dim>
660
+ <dim>64</dim>
661
+ </port>
662
+ </output>
663
+ </layer>
664
+ <layer id="48" name="self.encoder.layer.0.attention.self.value.weight" type="Const" version="opset1">
665
+ <data element_type="f32" shape="128, 128" offset="16027736" size="65536" />
666
+ <output>
667
+ <port id="0" precision="FP32" names="self.encoder.layer.0.attention.self.value.weight">
668
+ <dim>128</dim>
669
+ <dim>128</dim>
670
+ </port>
671
+ </output>
672
+ </layer>
673
+ <layer id="49" name="__module.encoder.layer.0.attention.self.value/aten::linear/MatMul" type="MatMul" version="opset1">
674
+ <data transpose_a="false" transpose_b="true" />
675
+ <input>
676
+ <port id="0" precision="FP32">
677
+ <dim>-1</dim>
678
+ <dim>-1</dim>
679
+ <dim>128</dim>
680
+ </port>
681
+ <port id="1" precision="FP32">
682
+ <dim>128</dim>
683
+ <dim>128</dim>
684
+ </port>
685
+ </input>
686
+ <output>
687
+ <port id="2" precision="FP32">
688
+ <dim>-1</dim>
689
+ <dim>-1</dim>
690
+ <dim>128</dim>
691
+ </port>
692
+ </output>
693
+ </layer>
694
+ <layer id="50" name="Constant_4064" type="Const" version="opset1">
695
+ <data element_type="f32" shape="1, 1, 128" offset="16093272" size="512" />
696
+ <output>
697
+ <port id="0" precision="FP32">
698
+ <dim>1</dim>
699
+ <dim>1</dim>
700
+ <dim>128</dim>
701
+ </port>
702
+ </output>
703
+ </layer>
704
+ <layer id="51" name="__module.encoder.layer.0.attention.self.value/aten::linear/Add" type="Add" version="opset1">
705
+ <data auto_broadcast="numpy" />
706
+ <input>
707
+ <port id="0" precision="FP32">
708
+ <dim>-1</dim>
709
+ <dim>-1</dim>
710
+ <dim>128</dim>
711
+ </port>
712
+ <port id="1" precision="FP32">
713
+ <dim>1</dim>
714
+ <dim>1</dim>
715
+ <dim>128</dim>
716
+ </port>
717
+ </input>
718
+ <output>
719
+ <port id="2" precision="FP32" names="138,x.9">
720
+ <dim>-1</dim>
721
+ <dim>-1</dim>
722
+ <dim>128</dim>
723
+ </port>
724
+ </output>
725
+ </layer>
726
+ <layer id="52" name="__module.encoder.layer.0.attention.self/prim::ListConstruct/Concat_2" type="Const" version="opset1">
727
+ <data element_type="i64" shape="4" offset="15961624" size="32" />
728
+ <output>
729
+ <port id="0" precision="I64">
730
+ <dim>4</dim>
731
+ </port>
732
+ </output>
733
+ </layer>
734
+ <layer id="53" name="__module.encoder.layer.0.attention.self/aten::view/Reshape_2" type="Reshape" version="opset1">
735
+ <data special_zero="true" />
736
+ <input>
737
+ <port id="0" precision="FP32">
738
+ <dim>-1</dim>
739
+ <dim>-1</dim>
740
+ <dim>128</dim>
741
+ </port>
742
+ <port id="1" precision="I64">
743
+ <dim>4</dim>
744
+ </port>
745
+ </input>
746
+ <output>
747
+ <port id="2" precision="FP32" names="142,x.11">
748
+ <dim>-1</dim>
749
+ <dim>-1</dim>
750
+ <dim>2</dim>
751
+ <dim>64</dim>
752
+ </port>
753
+ </output>
754
+ </layer>
755
+ <layer id="54" name="Constant_288" type="Const" version="opset1">
756
+ <data element_type="i64" shape="4" offset="15961656" size="32" />
757
+ <output>
758
+ <port id="0" precision="I64" names="143">
759
+ <dim>4</dim>
760
+ </port>
761
+ </output>
762
+ </layer>
763
+ <layer id="55" name="__module.encoder.layer.0.attention.self/aten::permute/Transpose_2" type="Transpose" version="opset1">
764
+ <input>
765
+ <port id="0" precision="FP32">
766
+ <dim>-1</dim>
767
+ <dim>-1</dim>
768
+ <dim>2</dim>
769
+ <dim>64</dim>
770
+ </port>
771
+ <port id="1" precision="I64">
772
+ <dim>4</dim>
773
+ </port>
774
+ </input>
775
+ <output>
776
+ <port id="2" precision="FP32" names="144">
777
+ <dim>-1</dim>
778
+ <dim>2</dim>
779
+ <dim>-1</dim>
780
+ <dim>64</dim>
781
+ </port>
782
+ </output>
783
+ </layer>
784
+ <layer id="56" name="Constant_4066" type="Const" version="opset1">
785
+ <data element_type="f32" shape="1, 1, 1, 1" offset="16093784" size="4" />
786
+ <output>
787
+ <port id="0" precision="FP32">
788
+ <dim>1</dim>
789
+ <dim>1</dim>
790
+ <dim>1</dim>
791
+ <dim>1</dim>
792
+ </port>
793
+ </output>
794
+ </layer>
795
+ <layer id="57" name="25" type="Const" version="opset1">
796
+ <data element_type="i64" shape="" offset="15894540" size="8" />
797
+ <output>
798
+ <port id="0" precision="I64" names="25" />
799
+ </output>
800
+ </layer>
801
+ <layer id="58" name="aten::unsqueeze/Unsqueeze" type="Unsqueeze" version="opset1">
802
+ <input>
803
+ <port id="0" precision="I64">
804
+ <dim>-1</dim>
805
+ <dim>-1</dim>
806
+ </port>
807
+ <port id="1" precision="I64" />
808
+ </input>
809
+ <output>
810
+ <port id="2" precision="I64" names="26">
811
+ <dim>-1</dim>
812
+ <dim>1</dim>
813
+ <dim>-1</dim>
814
+ </port>
815
+ </output>
816
+ </layer>
817
+ <layer id="59" name="27" type="Const" version="opset1">
818
+ <data element_type="i64" shape="" offset="16093788" size="8" />
819
+ <output>
820
+ <port id="0" precision="I64" names="27" />
821
+ </output>
822
+ </layer>
823
+ <layer id="60" name="aten::unsqueeze/Unsqueeze_1" type="Unsqueeze" version="opset1">
824
+ <input>
825
+ <port id="0" precision="I64">
826
+ <dim>-1</dim>
827
+ <dim>1</dim>
828
+ <dim>-1</dim>
829
+ </port>
830
+ <port id="1" precision="I64" />
831
+ </input>
832
+ <output>
833
+ <port id="2" precision="I64" names="28,33">
834
+ <dim>-1</dim>
835
+ <dim>1</dim>
836
+ <dim>1</dim>
837
+ <dim>-1</dim>
838
+ </port>
839
+ </output>
840
+ </layer>
841
+ <layer id="61" name="ShapeOf_4125" type="ShapeOf" version="opset3">
842
+ <data output_type="i64" />
843
+ <input>
844
+ <port id="0" precision="I64">
845
+ <dim>-1</dim>
846
+ <dim>-1</dim>
847
+ </port>
848
+ </input>
849
+ <output>
850
+ <port id="1" precision="I64">
851
+ <dim>2</dim>
852
+ </port>
853
+ </output>
854
+ </layer>
855
+ <layer id="62" name="Constant_4241" type="Const" version="opset1">
856
+ <data element_type="i64" shape="1" offset="15894532" size="8" />
857
+ <output>
858
+ <port id="0" precision="I64">
859
+ <dim>1</dim>
860
+ </port>
861
+ </output>
862
+ </layer>
863
+ <layer id="63" name="Constant_4127" type="Const" version="opset1">
864
+ <data element_type="i64" shape="" offset="15894532" size="8" />
865
+ <output>
866
+ <port id="0" precision="I64" />
867
+ </output>
868
+ </layer>
869
+ <layer id="64" name="Gather_4128" type="Gather" version="opset8">
870
+ <data batch_dims="0" />
871
+ <input>
872
+ <port id="0" precision="I64">
873
+ <dim>2</dim>
874
+ </port>
875
+ <port id="1" precision="I64">
876
+ <dim>1</dim>
877
+ </port>
878
+ <port id="2" precision="I64" />
879
+ </input>
880
+ <output>
881
+ <port id="3" precision="I64" names="13,15">
882
+ <dim>1</dim>
883
+ </port>
884
+ </output>
885
+ </layer>
886
+ <layer id="65" name="Constant_3694" type="Const" version="opset1">
887
+ <data element_type="i64" shape="1" offset="15894540" size="8" />
888
+ <output>
889
+ <port id="0" precision="I64">
890
+ <dim>1</dim>
891
+ </port>
892
+ </output>
893
+ </layer>
894
+ <layer id="66" name="Constant_4244" type="Const" version="opset1">
895
+ <data element_type="i64" shape="1" offset="15894540" size="8" />
896
+ <output>
897
+ <port id="0" precision="I64">
898
+ <dim>1</dim>
899
+ </port>
900
+ </output>
901
+ </layer>
902
+ <layer id="67" name="Constant_4135" type="Const" version="opset1">
903
+ <data element_type="i64" shape="" offset="15894532" size="8" />
904
+ <output>
905
+ <port id="0" precision="I64" />
906
+ </output>
907
+ </layer>
908
+ <layer id="68" name="Gather_4136" type="Gather" version="opset8">
909
+ <data batch_dims="0" />
910
+ <input>
911
+ <port id="0" precision="I64">
912
+ <dim>2</dim>
913
+ </port>
914
+ <port id="1" precision="I64">
915
+ <dim>1</dim>
916
+ </port>
917
+ <port id="2" precision="I64" />
918
+ </input>
919
+ <output>
920
+ <port id="3" precision="I64" names="17,19">
921
+ <dim>1</dim>
922
+ </port>
923
+ </output>
924
+ </layer>
925
+ <layer id="69" name="prim::ListConstruct/Concat" type="Concat" version="opset1">
926
+ <data axis="0" />
927
+ <input>
928
+ <port id="0" precision="I64">
929
+ <dim>1</dim>
930
+ </port>
931
+ <port id="1" precision="I64">
932
+ <dim>1</dim>
933
+ </port>
934
+ <port id="2" precision="I64">
935
+ <dim>1</dim>
936
+ </port>
937
+ <port id="3" precision="I64">
938
+ <dim>1</dim>
939
+ </port>
940
+ </input>
941
+ <output>
942
+ <port id="4" precision="I64" names="35">
943
+ <dim>4</dim>
944
+ </port>
945
+ </output>
946
+ </layer>
947
+ <layer id="70" name="aten::expand/Broadcast" type="Broadcast" version="opset3">
948
+ <data mode="bidirectional" />
949
+ <input>
950
+ <port id="0" precision="I64">
951
+ <dim>-1</dim>
952
+ <dim>1</dim>
953
+ <dim>1</dim>
954
+ <dim>-1</dim>
955
+ </port>
956
+ <port id="1" precision="I64">
957
+ <dim>4</dim>
958
+ </port>
959
+ </input>
960
+ <output>
961
+ <port id="2" precision="I64" names="37">
962
+ <dim>-1</dim>
963
+ <dim>1</dim>
964
+ <dim>-1</dim>
965
+ <dim>-1</dim>
966
+ </port>
967
+ </output>
968
+ </layer>
969
+ <layer id="71" name="aten::to/Convert" type="Convert" version="opset1">
970
+ <data destination_type="f32" />
971
+ <input>
972
+ <port id="0" precision="I64">
973
+ <dim>-1</dim>
974
+ <dim>1</dim>
975
+ <dim>-1</dim>
976
+ <dim>-1</dim>
977
+ </port>
978
+ </input>
979
+ <output>
980
+ <port id="1" precision="FP32" names="42">
981
+ <dim>-1</dim>
982
+ <dim>1</dim>
983
+ <dim>-1</dim>
984
+ <dim>-1</dim>
985
+ </port>
986
+ </output>
987
+ </layer>
988
+ <layer id="72" name="Constant_4065" type="Const" version="opset1">
989
+ <data element_type="f32" shape="1, 1, 1, 1" offset="16093784" size="4" />
990
+ <output>
991
+ <port id="0" precision="FP32">
992
+ <dim>1</dim>
993
+ <dim>1</dim>
994
+ <dim>1</dim>
995
+ <dim>1</dim>
996
+ </port>
997
+ </output>
998
+ </layer>
999
+ <layer id="73" name="aten::rsub/Multiply" type="Multiply" version="opset1">
1000
+ <data auto_broadcast="numpy" />
1001
+ <input>
1002
+ <port id="0" precision="FP32">
1003
+ <dim>-1</dim>
1004
+ <dim>1</dim>
1005
+ <dim>-1</dim>
1006
+ <dim>-1</dim>
1007
+ </port>
1008
+ <port id="1" precision="FP32">
1009
+ <dim>1</dim>
1010
+ <dim>1</dim>
1011
+ <dim>1</dim>
1012
+ <dim>1</dim>
1013
+ </port>
1014
+ </input>
1015
+ <output>
1016
+ <port id="2" precision="FP32">
1017
+ <dim>-1</dim>
1018
+ <dim>1</dim>
1019
+ <dim>-1</dim>
1020
+ <dim>-1</dim>
1021
+ </port>
1022
+ </output>
1023
+ </layer>
1024
+ <layer id="74" name="aten::rsub/Subtract" type="Subtract" version="opset1">
1025
+ <data auto_broadcast="numpy" />
1026
+ <input>
1027
+ <port id="0" precision="FP32">
1028
+ <dim>1</dim>
1029
+ <dim>1</dim>
1030
+ <dim>1</dim>
1031
+ <dim>1</dim>
1032
+ </port>
1033
+ <port id="1" precision="FP32">
1034
+ <dim>-1</dim>
1035
+ <dim>1</dim>
1036
+ <dim>-1</dim>
1037
+ <dim>-1</dim>
1038
+ </port>
1039
+ </input>
1040
+ <output>
1041
+ <port id="2" precision="FP32" names="45,inverted_mask">
1042
+ <dim>-1</dim>
1043
+ <dim>1</dim>
1044
+ <dim>-1</dim>
1045
+ <dim>-1</dim>
1046
+ </port>
1047
+ </output>
1048
+ </layer>
1049
+ <layer id="75" name="aten::to/Convert_1" type="Convert" version="opset1">
1050
+ <data destination_type="boolean" />
1051
+ <input>
1052
+ <port id="0" precision="FP32">
1053
+ <dim>-1</dim>
1054
+ <dim>1</dim>
1055
+ <dim>-1</dim>
1056
+ <dim>-1</dim>
1057
+ </port>
1058
+ </input>
1059
+ <output>
1060
+ <port id="1" precision="BOOL" names="50">
1061
+ <dim>-1</dim>
1062
+ <dim>1</dim>
1063
+ <dim>-1</dim>
1064
+ <dim>-1</dim>
1065
+ </port>
1066
+ </output>
1067
+ </layer>
1068
+ <layer id="76" name="aten::masked_fill/ConvertLike" type="Const" version="opset1">
1069
+ <data element_type="f32" shape="" offset="16093796" size="4" />
1070
+ <output>
1071
+ <port id="0" precision="FP32" />
1072
+ </output>
1073
+ </layer>
1074
+ <layer id="77" name="aten::masked_fill/Select" type="Select" version="opset1">
1075
+ <data auto_broadcast="numpy" />
1076
+ <input>
1077
+ <port id="0" precision="BOOL">
1078
+ <dim>-1</dim>
1079
+ <dim>1</dim>
1080
+ <dim>-1</dim>
1081
+ <dim>-1</dim>
1082
+ </port>
1083
+ <port id="1" precision="FP32" />
1084
+ <port id="2" precision="FP32">
1085
+ <dim>-1</dim>
1086
+ <dim>1</dim>
1087
+ <dim>-1</dim>
1088
+ <dim>-1</dim>
1089
+ </port>
1090
+ </input>
1091
+ <output>
1092
+ <port id="3" precision="FP32" names="52">
1093
+ <dim>-1</dim>
1094
+ <dim>1</dim>
1095
+ <dim>-1</dim>
1096
+ <dim>-1</dim>
1097
+ </port>
1098
+ </output>
1099
+ </layer>
1100
+ <layer id="78" name="__module.encoder.layer.0.attention.self/aten::scaled_dot_product_attention/ScaledDotProductAttention" type="ScaledDotProductAttention" version="opset13">
1101
+ <data causal="false" />
1102
+ <input>
1103
+ <port id="0" precision="FP32">
1104
+ <dim>-1</dim>
1105
+ <dim>2</dim>
1106
+ <dim>-1</dim>
1107
+ <dim>64</dim>
1108
+ </port>
1109
+ <port id="1" precision="FP32">
1110
+ <dim>-1</dim>
1111
+ <dim>2</dim>
1112
+ <dim>-1</dim>
1113
+ <dim>64</dim>
1114
+ </port>
1115
+ <port id="2" precision="FP32">
1116
+ <dim>-1</dim>
1117
+ <dim>2</dim>
1118
+ <dim>-1</dim>
1119
+ <dim>64</dim>
1120
+ </port>
1121
+ <port id="3" precision="FP32">
1122
+ <dim>-1</dim>
1123
+ <dim>1</dim>
1124
+ <dim>-1</dim>
1125
+ <dim>-1</dim>
1126
+ </port>
1127
+ </input>
1128
+ <output>
1129
+ <port id="4" precision="FP32" names="145,attn_output.1">
1130
+ <dim>-1</dim>
1131
+ <dim>2</dim>
1132
+ <dim>-1</dim>
1133
+ <dim>64</dim>
1134
+ </port>
1135
+ </output>
1136
+ </layer>
1137
+ <layer id="79" name="__module.encoder.layer.0.attention.self/aten::transpose/ScatterElementsUpdate" type="Const" version="opset1">
1138
+ <data element_type="i32" shape="4" offset="16093800" size="16" />
1139
+ <output>
1140
+ <port id="0" precision="I32">
1141
+ <dim>4</dim>
1142
+ </port>
1143
+ </output>
1144
+ </layer>
1145
+ <layer id="80" name="__module.encoder.layer.0.attention.self/aten::transpose/Transpose" type="Transpose" version="opset1">
1146
+ <input>
1147
+ <port id="0" precision="FP32">
1148
+ <dim>-1</dim>
1149
+ <dim>2</dim>
1150
+ <dim>-1</dim>
1151
+ <dim>64</dim>
1152
+ </port>
1153
+ <port id="1" precision="I32">
1154
+ <dim>4</dim>
1155
+ </port>
1156
+ </input>
1157
+ <output>
1158
+ <port id="2" precision="FP32" names="146,attn_output.3">
1159
+ <dim>-1</dim>
1160
+ <dim>-1</dim>
1161
+ <dim>2</dim>
1162
+ <dim>64</dim>
1163
+ </port>
1164
+ </output>
1165
+ </layer>
1166
+ <layer id="81" name="__module.encoder.layer.0.attention.self/aten::size/ShapeOf_6" type="ShapeOf" version="opset3">
1167
+ <data output_type="i64" />
1168
+ <input>
1169
+ <port id="0" precision="FP32">
1170
+ <dim>-1</dim>
1171
+ <dim>-1</dim>
1172
+ <dim>128</dim>
1173
+ </port>
1174
+ </input>
1175
+ <output>
1176
+ <port id="1" precision="I64">
1177
+ <dim>3</dim>
1178
+ </port>
1179
+ </output>
1180
+ </layer>
1181
+ <layer id="82" name="Constant_3859" type="Const" version="opset1">
1182
+ <data element_type="i64" shape="2" offset="16093816" size="16" />
1183
+ <output>
1184
+ <port id="0" precision="I64">
1185
+ <dim>2</dim>
1186
+ </port>
1187
+ </output>
1188
+ </layer>
1189
+ <layer id="83" name="Constant_3860" type="Const" version="opset1">
1190
+ <data element_type="i64" shape="" offset="15894532" size="8" />
1191
+ <output>
1192
+ <port id="0" precision="I64" />
1193
+ </output>
1194
+ </layer>
1195
+ <layer id="84" name="Gather_3861" type="Gather" version="opset8">
1196
+ <data batch_dims="0" />
1197
+ <input>
1198
+ <port id="0" precision="I64">
1199
+ <dim>3</dim>
1200
+ </port>
1201
+ <port id="1" precision="I64">
1202
+ <dim>2</dim>
1203
+ </port>
1204
+ <port id="2" precision="I64" />
1205
+ </input>
1206
+ <output>
1207
+ <port id="3" precision="I64">
1208
+ <dim>2</dim>
1209
+ </port>
1210
+ </output>
1211
+ </layer>
1212
+ <layer id="85" name="__module.encoder.layer.0.attention.self/prim::ListConstruct/Reshape_1_3" type="Const" version="opset1">
1213
+ <data element_type="i64" shape="1" offset="16093832" size="8" />
1214
+ <output>
1215
+ <port id="0" precision="I64">
1216
+ <dim>1</dim>
1217
+ </port>
1218
+ </output>
1219
+ </layer>
1220
+ <layer id="86" name="__module.encoder.layer.0.attention.self/prim::ListConstruct/Concat_3" type="Concat" version="opset1">
1221
+ <data axis="0" />
1222
+ <input>
1223
+ <port id="0" precision="I64">
1224
+ <dim>2</dim>
1225
+ </port>
1226
+ <port id="1" precision="I64">
1227
+ <dim>1</dim>
1228
+ </port>
1229
+ </input>
1230
+ <output>
1231
+ <port id="2" precision="I64" names="147">
1232
+ <dim>3</dim>
1233
+ </port>
1234
+ </output>
1235
+ </layer>
1236
+ <layer id="87" name="__module.encoder.layer.0.attention.self/aten::reshape/Reshape" type="Reshape" version="opset1">
1237
+ <data special_zero="false" />
1238
+ <input>
1239
+ <port id="0" precision="FP32">
1240
+ <dim>-1</dim>
1241
+ <dim>-1</dim>
1242
+ <dim>2</dim>
1243
+ <dim>64</dim>
1244
+ </port>
1245
+ <port id="1" precision="I64">
1246
+ <dim>3</dim>
1247
+ </port>
1248
+ </input>
1249
+ <output>
1250
+ <port id="2" precision="FP32" names="148">
1251
+ <dim>-1</dim>
1252
+ <dim>-1</dim>
1253
+ <dim>128</dim>
1254
+ </port>
1255
+ </output>
1256
+ </layer>
1257
+ <layer id="88" name="self.encoder.layer.0.attention.output.dense.weight" type="Const" version="opset1">
1258
+ <data element_type="f32" shape="128, 128" offset="16093840" size="65536" />
1259
+ <output>
1260
+ <port id="0" precision="FP32" names="self.encoder.layer.0.attention.output.dense.weight">
1261
+ <dim>128</dim>
1262
+ <dim>128</dim>
1263
+ </port>
1264
+ </output>
1265
+ </layer>
1266
+ <layer id="89" name="__module.encoder.layer.0.attention.output.dense/aten::linear/MatMul" type="MatMul" version="opset1">
1267
+ <data transpose_a="false" transpose_b="true" />
1268
+ <input>
1269
+ <port id="0" precision="FP32">
1270
+ <dim>-1</dim>
1271
+ <dim>-1</dim>
1272
+ <dim>128</dim>
1273
+ </port>
1274
+ <port id="1" precision="FP32">
1275
+ <dim>128</dim>
1276
+ <dim>128</dim>
1277
+ </port>
1278
+ </input>
1279
+ <output>
1280
+ <port id="2" precision="FP32">
1281
+ <dim>-1</dim>
1282
+ <dim>-1</dim>
1283
+ <dim>128</dim>
1284
+ </port>
1285
+ </output>
1286
+ </layer>
1287
+ <layer id="90" name="Constant_4067" type="Const" version="opset1">
1288
+ <data element_type="f32" shape="1, 1, 128" offset="16159376" size="512" />
1289
+ <output>
1290
+ <port id="0" precision="FP32">
1291
+ <dim>1</dim>
1292
+ <dim>1</dim>
1293
+ <dim>128</dim>
1294
+ </port>
1295
+ </output>
1296
+ </layer>
1297
+ <layer id="91" name="__module.encoder.layer.0.attention.output.dense/aten::linear/Add" type="Add" version="opset1">
1298
+ <data auto_broadcast="numpy" />
1299
+ <input>
1300
+ <port id="0" precision="FP32">
1301
+ <dim>-1</dim>
1302
+ <dim>-1</dim>
1303
+ <dim>128</dim>
1304
+ </port>
1305
+ <port id="1" precision="FP32">
1306
+ <dim>1</dim>
1307
+ <dim>1</dim>
1308
+ <dim>128</dim>
1309
+ </port>
1310
+ </input>
1311
+ <output>
1312
+ <port id="2" precision="FP32" names="154,input.3">
1313
+ <dim>-1</dim>
1314
+ <dim>-1</dim>
1315
+ <dim>128</dim>
1316
+ </port>
1317
+ </output>
1318
+ </layer>
1319
+ <layer id="92" name="__module.encoder.layer.0.attention.output/aten::add/Add" type="Add" version="opset1">
1320
+ <data auto_broadcast="numpy" />
1321
+ <input>
1322
+ <port id="0" precision="FP32">
1323
+ <dim>-1</dim>
1324
+ <dim>-1</dim>
1325
+ <dim>128</dim>
1326
+ </port>
1327
+ <port id="1" precision="FP32">
1328
+ <dim>-1</dim>
1329
+ <dim>-1</dim>
1330
+ <dim>128</dim>
1331
+ </port>
1332
+ </input>
1333
+ <output>
1334
+ <port id="2" precision="FP32" names="156">
1335
+ <dim>-1</dim>
1336
+ <dim>-1</dim>
1337
+ <dim>128</dim>
1338
+ </port>
1339
+ </output>
1340
+ </layer>
1341
+ <layer id="93" name="__module.encoder.layer.0.attention.output.LayerNorm/aten::layer_norm/Multiply" type="Const" version="opset1">
1342
+ <data element_type="i32" shape="1" offset="15894548" size="4" />
1343
+ <output>
1344
+ <port id="0" precision="I32">
1345
+ <dim>1</dim>
1346
+ </port>
1347
+ </output>
1348
+ </layer>
1349
+ <layer id="94" name="__module.encoder.layer.0.attention.output.LayerNorm/aten::layer_norm/MVN" type="MVN" version="opset6">
1350
+ <data eps="9.999999960041972e-13" normalize_variance="true" eps_mode="INSIDE_SQRT" />
1351
+ <input>
1352
+ <port id="0" precision="FP32">
1353
+ <dim>-1</dim>
1354
+ <dim>-1</dim>
1355
+ <dim>128</dim>
1356
+ </port>
1357
+ <port id="1" precision="I32">
1358
+ <dim>1</dim>
1359
+ </port>
1360
+ </input>
1361
+ <output>
1362
+ <port id="2" precision="FP32">
1363
+ <dim>-1</dim>
1364
+ <dim>-1</dim>
1365
+ <dim>128</dim>
1366
+ </port>
1367
+ </output>
1368
+ </layer>
1369
+ <layer id="95" name="Constant_4068" type="Const" version="opset1">
1370
+ <data element_type="f32" shape="1, 1, 128" offset="16159888" size="512" />
1371
+ <output>
1372
+ <port id="0" precision="FP32">
1373
+ <dim>1</dim>
1374
+ <dim>1</dim>
1375
+ <dim>128</dim>
1376
+ </port>
1377
+ </output>
1378
+ </layer>
1379
+ <layer id="96" name="__module.encoder.layer.0.attention.output.LayerNorm/aten::layer_norm/Multiply_1" type="Multiply" version="opset1">
1380
+ <data auto_broadcast="numpy" />
1381
+ <input>
1382
+ <port id="0" precision="FP32">
1383
+ <dim>-1</dim>
1384
+ <dim>-1</dim>
1385
+ <dim>128</dim>
1386
+ </port>
1387
+ <port id="1" precision="FP32">
1388
+ <dim>1</dim>
1389
+ <dim>1</dim>
1390
+ <dim>128</dim>
1391
+ </port>
1392
+ </input>
1393
+ <output>
1394
+ <port id="2" precision="FP32">
1395
+ <dim>-1</dim>
1396
+ <dim>-1</dim>
1397
+ <dim>128</dim>
1398
+ </port>
1399
+ </output>
1400
+ </layer>
1401
+ <layer id="97" name="Constant_4069" type="Const" version="opset1">
1402
+ <data element_type="f32" shape="1, 1, 128" offset="16160400" size="512" />
1403
+ <output>
1404
+ <port id="0" precision="FP32">
1405
+ <dim>1</dim>
1406
+ <dim>1</dim>
1407
+ <dim>128</dim>
1408
+ </port>
1409
+ </output>
1410
+ </layer>
1411
+ <layer id="98" name="__module.encoder.layer.0.attention.output.LayerNorm/aten::layer_norm/Add" type="Add" version="opset1">
1412
+ <data auto_broadcast="numpy" />
1413
+ <input>
1414
+ <port id="0" precision="FP32">
1415
+ <dim>-1</dim>
1416
+ <dim>-1</dim>
1417
+ <dim>128</dim>
1418
+ </port>
1419
+ <port id="1" precision="FP32">
1420
+ <dim>1</dim>
1421
+ <dim>1</dim>
1422
+ <dim>128</dim>
1423
+ </port>
1424
+ </input>
1425
+ <output>
1426
+ <port id="2" precision="FP32" names="160,input_tensor.1">
1427
+ <dim>-1</dim>
1428
+ <dim>-1</dim>
1429
+ <dim>128</dim>
1430
+ </port>
1431
+ </output>
1432
+ </layer>
1433
+ <layer id="99" name="self.encoder.layer.0.intermediate.dense.weight" type="Const" version="opset1">
1434
+ <data element_type="f32" shape="512, 128" offset="16160912" size="262144" />
1435
+ <output>
1436
+ <port id="0" precision="FP32" names="self.encoder.layer.0.intermediate.dense.weight">
1437
+ <dim>512</dim>
1438
+ <dim>128</dim>
1439
+ </port>
1440
+ </output>
1441
+ </layer>
1442
+ <layer id="100" name="__module.encoder.layer.0.intermediate.dense/aten::linear/MatMul" type="MatMul" version="opset1">
1443
+ <data transpose_a="false" transpose_b="true" />
1444
+ <input>
1445
+ <port id="0" precision="FP32">
1446
+ <dim>-1</dim>
1447
+ <dim>-1</dim>
1448
+ <dim>128</dim>
1449
+ </port>
1450
+ <port id="1" precision="FP32">
1451
+ <dim>512</dim>
1452
+ <dim>128</dim>
1453
+ </port>
1454
+ </input>
1455
+ <output>
1456
+ <port id="2" precision="FP32">
1457
+ <dim>-1</dim>
1458
+ <dim>-1</dim>
1459
+ <dim>512</dim>
1460
+ </port>
1461
+ </output>
1462
+ </layer>
1463
+ <layer id="101" name="Constant_4070" type="Const" version="opset1">
1464
+ <data element_type="f32" shape="1, 1, 512" offset="16423056" size="2048" />
1465
+ <output>
1466
+ <port id="0" precision="FP32">
1467
+ <dim>1</dim>
1468
+ <dim>1</dim>
1469
+ <dim>512</dim>
1470
+ </port>
1471
+ </output>
1472
+ </layer>
1473
+ <layer id="102" name="__module.encoder.layer.0.intermediate.dense/aten::linear/Add" type="Add" version="opset1">
1474
+ <data auto_broadcast="numpy" />
1475
+ <input>
1476
+ <port id="0" precision="FP32">
1477
+ <dim>-1</dim>
1478
+ <dim>-1</dim>
1479
+ <dim>512</dim>
1480
+ </port>
1481
+ <port id="1" precision="FP32">
1482
+ <dim>1</dim>
1483
+ <dim>1</dim>
1484
+ <dim>512</dim>
1485
+ </port>
1486
+ </input>
1487
+ <output>
1488
+ <port id="2" precision="FP32" names="165">
1489
+ <dim>-1</dim>
1490
+ <dim>-1</dim>
1491
+ <dim>512</dim>
1492
+ </port>
1493
+ </output>
1494
+ </layer>
1495
+ <layer id="103" name="__module.encoder.layer.0.intermediate.intermediate_act_fn/aten::gelu/Gelu" type="Gelu" version="opset7">
1496
+ <data approximation_mode="ERF" />
1497
+ <input>
1498
+ <port id="0" precision="FP32">
1499
+ <dim>-1</dim>
1500
+ <dim>-1</dim>
1501
+ <dim>512</dim>
1502
+ </port>
1503
+ </input>
1504
+ <output>
1505
+ <port id="1" precision="FP32" names="166">
1506
+ <dim>-1</dim>
1507
+ <dim>-1</dim>
1508
+ <dim>512</dim>
1509
+ </port>
1510
+ </output>
1511
+ </layer>
1512
+ <layer id="104" name="self.encoder.layer.0.output.dense.weight" type="Const" version="opset1">
1513
+ <data element_type="f32" shape="128, 512" offset="16425104" size="262144" />
1514
+ <output>
1515
+ <port id="0" precision="FP32" names="self.encoder.layer.0.output.dense.weight">
1516
+ <dim>128</dim>
1517
+ <dim>512</dim>
1518
+ </port>
1519
+ </output>
1520
+ </layer>
1521
+ <layer id="105" name="__module.encoder.layer.0.output.dense/aten::linear/MatMul" type="MatMul" version="opset1">
1522
+ <data transpose_a="false" transpose_b="true" />
1523
+ <input>
1524
+ <port id="0" precision="FP32">
1525
+ <dim>-1</dim>
1526
+ <dim>-1</dim>
1527
+ <dim>512</dim>
1528
+ </port>
1529
+ <port id="1" precision="FP32">
1530
+ <dim>128</dim>
1531
+ <dim>512</dim>
1532
+ </port>
1533
+ </input>
1534
+ <output>
1535
+ <port id="2" precision="FP32">
1536
+ <dim>-1</dim>
1537
+ <dim>-1</dim>
1538
+ <dim>128</dim>
1539
+ </port>
1540
+ </output>
1541
+ </layer>
1542
+ <layer id="106" name="Constant_4071" type="Const" version="opset1">
1543
+ <data element_type="f32" shape="1, 1, 128" offset="16687248" size="512" />
1544
+ <output>
1545
+ <port id="0" precision="FP32">
1546
+ <dim>1</dim>
1547
+ <dim>1</dim>
1548
+ <dim>128</dim>
1549
+ </port>
1550
+ </output>
1551
+ </layer>
1552
+ <layer id="107" name="__module.encoder.layer.0.output.dense/aten::linear/Add" type="Add" version="opset1">
1553
+ <data auto_broadcast="numpy" />
1554
+ <input>
1555
+ <port id="0" precision="FP32">
1556
+ <dim>-1</dim>
1557
+ <dim>-1</dim>
1558
+ <dim>128</dim>
1559
+ </port>
1560
+ <port id="1" precision="FP32">
1561
+ <dim>1</dim>
1562
+ <dim>1</dim>
1563
+ <dim>128</dim>
1564
+ </port>
1565
+ </input>
1566
+ <output>
1567
+ <port id="2" precision="FP32" names="172,input.5">
1568
+ <dim>-1</dim>
1569
+ <dim>-1</dim>
1570
+ <dim>128</dim>
1571
+ </port>
1572
+ </output>
1573
+ </layer>
1574
+ <layer id="108" name="__module.encoder.layer.0.output/aten::add/Add" type="Add" version="opset1">
1575
+ <data auto_broadcast="numpy" />
1576
+ <input>
1577
+ <port id="0" precision="FP32">
1578
+ <dim>-1</dim>
1579
+ <dim>-1</dim>
1580
+ <dim>128</dim>
1581
+ </port>
1582
+ <port id="1" precision="FP32">
1583
+ <dim>-1</dim>
1584
+ <dim>-1</dim>
1585
+ <dim>128</dim>
1586
+ </port>
1587
+ </input>
1588
+ <output>
1589
+ <port id="2" precision="FP32" names="174">
1590
+ <dim>-1</dim>
1591
+ <dim>-1</dim>
1592
+ <dim>128</dim>
1593
+ </port>
1594
+ </output>
1595
+ </layer>
1596
+ <layer id="109" name="__module.encoder.layer.0.output.LayerNorm/aten::layer_norm/Multiply" type="Const" version="opset1">
1597
+ <data element_type="i32" shape="1" offset="15894548" size="4" />
1598
+ <output>
1599
+ <port id="0" precision="I32">
1600
+ <dim>1</dim>
1601
+ </port>
1602
+ </output>
1603
+ </layer>
1604
+ <layer id="110" name="__module.encoder.layer.0.output.LayerNorm/aten::layer_norm/MVN" type="MVN" version="opset6">
1605
+ <data eps="9.999999960041972e-13" normalize_variance="true" eps_mode="INSIDE_SQRT" />
1606
+ <input>
1607
+ <port id="0" precision="FP32">
1608
+ <dim>-1</dim>
1609
+ <dim>-1</dim>
1610
+ <dim>128</dim>
1611
+ </port>
1612
+ <port id="1" precision="I32">
1613
+ <dim>1</dim>
1614
+ </port>
1615
+ </input>
1616
+ <output>
1617
+ <port id="2" precision="FP32">
1618
+ <dim>-1</dim>
1619
+ <dim>-1</dim>
1620
+ <dim>128</dim>
1621
+ </port>
1622
+ </output>
1623
+ </layer>
1624
+ <layer id="111" name="Constant_4072" type="Const" version="opset1">
1625
+ <data element_type="f32" shape="1, 1, 128" offset="16687760" size="512" />
1626
+ <output>
1627
+ <port id="0" precision="FP32">
1628
+ <dim>1</dim>
1629
+ <dim>1</dim>
1630
+ <dim>128</dim>
1631
+ </port>
1632
+ </output>
1633
+ </layer>
1634
+ <layer id="112" name="__module.encoder.layer.0.output.LayerNorm/aten::layer_norm/Multiply_1" type="Multiply" version="opset1">
1635
+ <data auto_broadcast="numpy" />
1636
+ <input>
1637
+ <port id="0" precision="FP32">
1638
+ <dim>-1</dim>
1639
+ <dim>-1</dim>
1640
+ <dim>128</dim>
1641
+ </port>
1642
+ <port id="1" precision="FP32">
1643
+ <dim>1</dim>
1644
+ <dim>1</dim>
1645
+ <dim>128</dim>
1646
+ </port>
1647
+ </input>
1648
+ <output>
1649
+ <port id="2" precision="FP32">
1650
+ <dim>-1</dim>
1651
+ <dim>-1</dim>
1652
+ <dim>128</dim>
1653
+ </port>
1654
+ </output>
1655
+ </layer>
1656
+ <layer id="113" name="Constant_4073" type="Const" version="opset1">
1657
+ <data element_type="f32" shape="1, 1, 128" offset="16688272" size="512" />
1658
+ <output>
1659
+ <port id="0" precision="FP32">
1660
+ <dim>1</dim>
1661
+ <dim>1</dim>
1662
+ <dim>128</dim>
1663
+ </port>
1664
+ </output>
1665
+ </layer>
1666
+ <layer id="114" name="__module.encoder.layer.0.output.LayerNorm/aten::layer_norm/Add" type="Add" version="opset1">
1667
+ <data auto_broadcast="numpy" />
1668
+ <input>
1669
+ <port id="0" precision="FP32">
1670
+ <dim>-1</dim>
1671
+ <dim>-1</dim>
1672
+ <dim>128</dim>
1673
+ </port>
1674
+ <port id="1" precision="FP32">
1675
+ <dim>1</dim>
1676
+ <dim>1</dim>
1677
+ <dim>128</dim>
1678
+ </port>
1679
+ </input>
1680
+ <output>
1681
+ <port id="2" precision="FP32" names="178,hidden_states.7">
1682
+ <dim>-1</dim>
1683
+ <dim>-1</dim>
1684
+ <dim>128</dim>
1685
+ </port>
1686
+ </output>
1687
+ </layer>
+ <layer id="115" name="self.encoder.layer.1.attention.self.query.weight" type="Const" version="opset1">
+ <data element_type="f32" shape="128, 128" offset="16688784" size="65536" />
+ <output>
+ <port id="0" precision="FP32" names="self.encoder.layer.1.attention.self.query.weight">
+ <dim>128</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="116" name="__module.encoder.layer.1.attention.self.query/aten::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>128</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="117" name="Constant_4074" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 128" offset="16754320" size="512" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="118" name="__module.encoder.layer.1.attention.self.query/aten::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="191,x.13">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="119" name="__module.encoder.layer.1.attention.self/prim::ListConstruct/Concat" type="Const" version="opset1">
+ <data element_type="i64" shape="4" offset="15961624" size="32" />
+ <output>
+ <port id="0" precision="I64">
+ <dim>4</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="120" name="__module.encoder.layer.1.attention.self/aten::view/Reshape" type="Reshape" version="opset1">
+ <data special_zero="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>4</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="195,x.15">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="121" name="Constant_446" type="Const" version="opset1">
+ <data element_type="i64" shape="4" offset="15961656" size="32" />
+ <output>
+ <port id="0" precision="I64" names="196">
+ <dim>4</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="122" name="__module.encoder.layer.1.attention.self/aten::permute/Transpose" type="Transpose" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>4</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="197">
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="123" name="self.encoder.layer.1.attention.self.key.weight" type="Const" version="opset1">
+ <data element_type="f32" shape="128, 128" offset="16754832" size="65536" />
+ <output>
+ <port id="0" precision="FP32" names="self.encoder.layer.1.attention.self.key.weight">
+ <dim>128</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="124" name="__module.encoder.layer.1.attention.self.key/aten::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>128</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="125" name="Constant_4075" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 128" offset="16820368" size="512" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="126" name="__module.encoder.layer.1.attention.self.key/aten::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="200,x.17">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="127" name="__module.encoder.layer.1.attention.self/prim::ListConstruct/Concat_1" type="Const" version="opset1">
+ <data element_type="i64" shape="4" offset="15961624" size="32" />
+ <output>
+ <port id="0" precision="I64">
+ <dim>4</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="128" name="__module.encoder.layer.1.attention.self/aten::view/Reshape_1" type="Reshape" version="opset1">
+ <data special_zero="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>4</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="204,x.19">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="129" name="Constant_469" type="Const" version="opset1">
+ <data element_type="i64" shape="4" offset="15961656" size="32" />
+ <output>
+ <port id="0" precision="I64" names="205">
+ <dim>4</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="130" name="__module.encoder.layer.1.attention.self/aten::permute/Transpose_1" type="Transpose" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>4</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="206">
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="131" name="self.encoder.layer.1.attention.self.value.weight" type="Const" version="opset1">
+ <data element_type="f32" shape="128, 128" offset="16820880" size="65536" />
+ <output>
+ <port id="0" precision="FP32" names="self.encoder.layer.1.attention.self.value.weight">
+ <dim>128</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="132" name="__module.encoder.layer.1.attention.self.value/aten::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>128</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="133" name="Constant_4076" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 128" offset="16886416" size="512" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="134" name="__module.encoder.layer.1.attention.self.value/aten::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="209,x.21">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="135" name="__module.encoder.layer.1.attention.self/prim::ListConstruct/Concat_2" type="Const" version="opset1">
+ <data element_type="i64" shape="4" offset="15961624" size="32" />
+ <output>
+ <port id="0" precision="I64">
+ <dim>4</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="136" name="__module.encoder.layer.1.attention.self/aten::view/Reshape_2" type="Reshape" version="opset1">
+ <data special_zero="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>4</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="213,x">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="137" name="Constant_492" type="Const" version="opset1">
+ <data element_type="i64" shape="4" offset="15961656" size="32" />
+ <output>
+ <port id="0" precision="I64" names="214">
+ <dim>4</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="138" name="__module.encoder.layer.1.attention.self/aten::permute/Transpose_2" type="Transpose" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>4</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="215">
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="139" name="__module.encoder.layer.1.attention.self/aten::scaled_dot_product_attention/ScaledDotProductAttention" type="ScaledDotProductAttention" version="opset13">
+ <data causal="false" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ <port id="3" precision="FP32">
+ <dim>-1</dim>
+ <dim>1</dim>
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="4" precision="FP32" names="216,attn_output.5">
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="140" name="__module.encoder.layer.1.attention.self/aten::transpose/ScatterElementsUpdate" type="Const" version="opset1">
+ <data element_type="i32" shape="4" offset="16093800" size="16" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>4</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="141" name="__module.encoder.layer.1.attention.self/aten::transpose/Transpose" type="Transpose" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>-1</dim>
+ <dim>64</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>4</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="217,attn_output">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="142" name="__module.encoder.layer.1.attention.self/aten::size/ShapeOf_6" type="ShapeOf" version="opset3">
+ <data output_type="i64" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I64">
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="143" name="Constant_3879" type="Const" version="opset1">
+ <data element_type="i64" shape="2" offset="16093816" size="16" />
+ <output>
+ <port id="0" precision="I64">
+ <dim>2</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="144" name="Constant_3880" type="Const" version="opset1">
+ <data element_type="i64" shape="" offset="15894532" size="8" />
+ <output>
+ <port id="0" precision="I64" />
+ </output>
+ </layer>
+ <layer id="145" name="Gather_3881" type="Gather" version="opset8">
+ <data batch_dims="0" />
+ <input>
+ <port id="0" precision="I64">
+ <dim>3</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>2</dim>
+ </port>
+ <port id="2" precision="I64" />
+ </input>
+ <output>
+ <port id="3" precision="I64">
+ <dim>2</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="146" name="__module.encoder.layer.1.attention.self/prim::ListConstruct/Concat_3" type="Concat" version="opset1">
+ <data axis="0" />
+ <input>
+ <port id="0" precision="I64">
+ <dim>2</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="I64" names="218">
+ <dim>3</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="147" name="__module.encoder.layer.1.attention.self/aten::reshape/Reshape" type="Reshape" version="opset1">
+ <data special_zero="false" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>2</dim>
+ <dim>64</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>3</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="219">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="148" name="self.encoder.layer.1.attention.output.dense.weight" type="Const" version="opset1">
+ <data element_type="f32" shape="128, 128" offset="16886928" size="65536" />
+ <output>
+ <port id="0" precision="FP32" names="self.encoder.layer.1.attention.output.dense.weight">
+ <dim>128</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="149" name="__module.encoder.layer.1.attention.output.dense/aten::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>128</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="150" name="Constant_4077" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 128" offset="16952464" size="512" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="151" name="__module.encoder.layer.1.attention.output.dense/aten::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="225,input.7">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="152" name="__module.encoder.layer.1.attention.output/aten::add/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="227">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="153" name="__module.encoder.layer.1.attention.output.LayerNorm/aten::layer_norm/Multiply" type="Const" version="opset1">
+ <data element_type="i32" shape="1" offset="15894548" size="4" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="154" name="__module.encoder.layer.1.attention.output.LayerNorm/aten::layer_norm/MVN" type="MVN" version="opset6">
+ <data eps="9.999999960041972e-13" normalize_variance="true" eps_mode="INSIDE_SQRT" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="155" name="Constant_4078" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 128" offset="16952976" size="512" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="156" name="__module.encoder.layer.1.attention.output.LayerNorm/aten::layer_norm/Multiply_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="157" name="Constant_4079" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 128" offset="16953488" size="512" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="158" name="__module.encoder.layer.1.attention.output.LayerNorm/aten::layer_norm/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="231,input_tensor">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="159" name="self.encoder.layer.1.intermediate.dense.weight" type="Const" version="opset1">
+ <data element_type="f32" shape="512, 128" offset="16954000" size="262144" />
+ <output>
+ <port id="0" precision="FP32" names="self.encoder.layer.1.intermediate.dense.weight">
+ <dim>512</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="160" name="__module.encoder.layer.1.intermediate.dense/aten::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>512</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>512</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="161" name="Constant_4080" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 512" offset="17216144" size="2048" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>512</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="162" name="__module.encoder.layer.1.intermediate.dense/aten::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>512</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>512</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="236">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>512</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="163" name="__module.encoder.layer.1.intermediate.intermediate_act_fn/aten::gelu/Gelu" type="Gelu" version="opset7">
+ <data approximation_mode="ERF" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>512</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="FP32" names="237">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>512</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="164" name="self.encoder.layer.1.output.dense.weight" type="Const" version="opset1">
+ <data element_type="f32" shape="128, 512" offset="17218192" size="262144" />
+ <output>
+ <port id="0" precision="FP32" names="self.encoder.layer.1.output.dense.weight">
+ <dim>128</dim>
+ <dim>512</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="165" name="__module.encoder.layer.1.output.dense/aten::linear/MatMul" type="MatMul" version="opset1">
+ <data transpose_a="false" transpose_b="true" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>512</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>128</dim>
+ <dim>512</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="166" name="Constant_4081" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 128" offset="17480336" size="512" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="167" name="__module.encoder.layer.1.output.dense/aten::linear/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="243,input">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="168" name="__module.encoder.layer.1.output/aten::add/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="245">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="169" name="__module.encoder.layer.1.output.LayerNorm/aten::layer_norm/Multiply" type="Const" version="opset1">
+ <data element_type="i32" shape="1" offset="15894548" size="4" />
+ <output>
+ <port id="0" precision="I32">
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="170" name="__module.encoder.layer.1.output.LayerNorm/aten::layer_norm/MVN" type="MVN" version="opset6">
+ <data eps="9.999999960041972e-13" normalize_variance="true" eps_mode="INSIDE_SQRT" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="171" name="Constant_4082" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 128" offset="17480848" size="512" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="172" name="__module.encoder.layer.1.output.LayerNorm/aten::layer_norm/Multiply_1" type="Multiply" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="173" name="Constant_4083" type="Const" version="opset1">
+ <data element_type="f32" shape="1, 1, 128" offset="17481360" size="512" />
+ <output>
+ <port id="0" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="174" name="__module.encoder.layer.1.output.LayerNorm/aten::layer_norm/Add" type="Add" version="opset1">
+ <data auto_broadcast="numpy" />
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ <port id="1" precision="FP32">
+ <dim>1</dim>
+ <dim>1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="FP32" names="last_hidden_state">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="175" name="Result_1289" type="Result" version="opset1">
+ <input>
+ <port id="0" precision="FP32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ <dim>128</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="8" to-port="0" />
+ <edge from-layer="1" from-port="0" to-layer="58" to-port="0" />
+ <edge from-layer="1" from-port="0" to-layer="61" to-port="0" />
+ <edge from-layer="2" from-port="0" to-layer="4" to-port="0" />
+ <edge from-layer="2" from-port="0" to-layer="15" to-port="0" />
+ <edge from-layer="3" from-port="0" to-layer="6" to-port="0" />
+ <edge from-layer="4" from-port="1" to-layer="6" to-port="1" />
+ <edge from-layer="5" from-port="0" to-layer="6" to-port="2" />
+ <edge from-layer="6" from-port="3" to-layer="11" to-port="0" />
+ <edge from-layer="7" from-port="0" to-layer="10" to-port="0" />
+ <edge from-layer="8" from-port="1" to-layer="10" to-port="1" />
+ <edge from-layer="9" from-port="0" to-layer="10" to-port="2" />
+ <edge from-layer="10" from-port="3" to-layer="11" to-port="1" />
+ <edge from-layer="11" from-port="2" to-layer="25" to-port="0" />
+ <edge from-layer="12" from-port="0" to-layer="24" to-port="0" />
+ <edge from-layer="13" from-port="0" to-layer="21" to-port="0" />
+ <edge from-layer="14" from-port="0" to-layer="21" to-port="1" />
+ <edge from-layer="15" from-port="1" to-layer="18" to-port="0" />
+ <edge from-layer="16" from-port="0" to-layer="18" to-port="1" />
+ <edge from-layer="17" from-port="0" to-layer="18" to-port="2" />
+ <edge from-layer="18" from-port="3" to-layer="21" to-port="2" />
+ <edge from-layer="18" from-port="3" to-layer="69" to-port="2" />
+ <edge from-layer="19" from-port="0" to-layer="21" to-port="3" />
+ <edge from-layer="20" from-port="0" to-layer="21" to-port="4" />
+ <edge from-layer="21" from-port="5" to-layer="22" to-port="0" />
+ <edge from-layer="22" from-port="1" to-layer="24" to-port="1" />
+ <edge from-layer="23" from-port="0" to-layer="24" to-port="2" />
+ <edge from-layer="24" from-port="3" to-layer="25" to-port="1" />
+ <edge from-layer="25" from-port="2" to-layer="27" to-port="0" />
+ <edge from-layer="26" from-port="0" to-layer="27" to-port="1" />
+ <edge from-layer="27" from-port="2" to-layer="29" to-port="0" />
+ <edge from-layer="28" from-port="0" to-layer="29" to-port="1" />
+ <edge from-layer="29" from-port="2" to-layer="31" to-port="0" />
+ <edge from-layer="30" from-port="0" to-layer="31" to-port="1" />
+ <edge from-layer="31" from-port="2" to-layer="41" to-port="0" />
+ <edge from-layer="31" from-port="2" to-layer="92" to-port="1" />
+ <edge from-layer="31" from-port="2" to-layer="49" to-port="0" />
+ <edge from-layer="31" from-port="2" to-layer="33" to-port="0" />
+ <edge from-layer="31" from-port="2" to-layer="81" to-port="0" />
+ <edge from-layer="32" from-port="0" to-layer="33" to-port="1" />
+ <edge from-layer="33" from-port="2" to-layer="35" to-port="0" />
+ <edge from-layer="34" from-port="0" to-layer="35" to-port="1" />
+ <edge from-layer="35" from-port="2" to-layer="37" to-port="0" />
+ <edge from-layer="36" from-port="0" to-layer="37" to-port="1" />
+ <edge from-layer="37" from-port="2" to-layer="39" to-port="0" />
+ <edge from-layer="38" from-port="0" to-layer="39" to-port="1" />
+ <edge from-layer="39" from-port="2" to-layer="78" to-port="0" />
+ <edge from-layer="40" from-port="0" to-layer="41" to-port="1" />
+ <edge from-layer="41" from-port="2" to-layer="43" to-port="0" />
+ <edge from-layer="42" from-port="0" to-layer="43" to-port="1" />
+ <edge from-layer="43" from-port="2" to-layer="45" to-port="0" />
+ <edge from-layer="44" from-port="0" to-layer="45" to-port="1" />
+ <edge from-layer="45" from-port="2" to-layer="47" to-port="0" />
+ <edge from-layer="46" from-port="0" to-layer="47" to-port="1" />
+ <edge from-layer="47" from-port="2" to-layer="78" to-port="1" />
+ <edge from-layer="48" from-port="0" to-layer="49" to-port="1" />
+ <edge from-layer="49" from-port="2" to-layer="51" to-port="0" />
+ <edge from-layer="50" from-port="0" to-layer="51" to-port="1" />
+ <edge from-layer="51" from-port="2" to-layer="53" to-port="0" />
+ <edge from-layer="52" from-port="0" to-layer="53" to-port="1" />
+ <edge from-layer="53" from-port="2" to-layer="55" to-port="0" />
+ <edge from-layer="54" from-port="0" to-layer="55" to-port="1" />
+ <edge from-layer="55" from-port="2" to-layer="78" to-port="2" />
+ <edge from-layer="56" from-port="0" to-layer="74" to-port="0" />
+ <edge from-layer="57" from-port="0" to-layer="58" to-port="1" />
+ <edge from-layer="58" from-port="2" to-layer="60" to-port="0" />
+ <edge from-layer="59" from-port="0" to-layer="60" to-port="1" />
+ <edge from-layer="60" from-port="2" to-layer="70" to-port="0" />
+ <edge from-layer="61" from-port="1" to-layer="64" to-port="0" />
+ <edge from-layer="61" from-port="1" to-layer="68" to-port="0" />
+ <edge from-layer="62" from-port="0" to-layer="64" to-port="1" />
+ <edge from-layer="63" from-port="0" to-layer="64" to-port="2" />
+ <edge from-layer="64" from-port="3" to-layer="69" to-port="0" />
+ <edge from-layer="65" from-port="0" to-layer="69" to-port="1" />
+ <edge from-layer="66" from-port="0" to-layer="68" to-port="1" />
+ <edge from-layer="67" from-port="0" to-layer="68" to-port="2" />
+ <edge from-layer="68" from-port="3" to-layer="69" to-port="3" />
+ <edge from-layer="69" from-port="4" to-layer="70" to-port="1" />
+ <edge from-layer="70" from-port="2" to-layer="71" to-port="0" />
+ <edge from-layer="71" from-port="1" to-layer="73" to-port="0" />
+ <edge from-layer="72" from-port="0" to-layer="73" to-port="1" />
+ <edge from-layer="73" from-port="2" to-layer="74" to-port="1" />
+ <edge from-layer="74" from-port="2" to-layer="77" to-port="2" />
+ <edge from-layer="74" from-port="2" to-layer="75" to-port="0" />
+ <edge from-layer="75" from-port="1" to-layer="77" to-port="0" />
+ <edge from-layer="76" from-port="0" to-layer="77" to-port="1" />
+ <edge from-layer="77" from-port="3" to-layer="78" to-port="3" />
+ <edge from-layer="77" from-port="3" to-layer="139" to-port="3" />
+ <edge from-layer="78" from-port="4" to-layer="80" to-port="0" />
+ <edge from-layer="79" from-port="0" to-layer="80" to-port="1" />
+ <edge from-layer="80" from-port="2" to-layer="87" to-port="0" />
+ <edge from-layer="81" from-port="1" to-layer="84" to-port="0" />
+ <edge from-layer="82" from-port="0" to-layer="84" to-port="1" />
+ <edge from-layer="83" from-port="0" to-layer="84" to-port="2" />
+ <edge from-layer="84" from-port="3" to-layer="86" to-port="0" />
+ <edge from-layer="85" from-port="0" to-layer="86" to-port="1" />
+ <edge from-layer="85" from-port="0" to-layer="146" to-port="1" />
+ <edge from-layer="86" from-port="2" to-layer="87" to-port="1" />
+ <edge from-layer="87" from-port="2" to-layer="89" to-port="0" />
+ <edge from-layer="88" from-port="0" to-layer="89" to-port="1" />
+ <edge from-layer="89" from-port="2" to-layer="91" to-port="0" />
+ <edge from-layer="90" from-port="0" to-layer="91" to-port="1" />
+ <edge from-layer="91" from-port="2" to-layer="92" to-port="0" />
+ <edge from-layer="92" from-port="2" to-layer="94" to-port="0" />
+ <edge from-layer="93" from-port="0" to-layer="94" to-port="1" />
+ <edge from-layer="94" from-port="2" to-layer="96" to-port="0" />
+ <edge from-layer="95" from-port="0" to-layer="96" to-port="1" />
+ <edge from-layer="96" from-port="2" to-layer="98" to-port="0" />
+ <edge from-layer="97" from-port="0" to-layer="98" to-port="1" />
+ <edge from-layer="98" from-port="2" to-layer="100" to-port="0" />
+ <edge from-layer="98" from-port="2" to-layer="108" to-port="1" />
+ <edge from-layer="99" from-port="0" to-layer="100" to-port="1" />
+ <edge from-layer="100" from-port="2" to-layer="102" to-port="0" />
+ <edge from-layer="101" from-port="0" to-layer="102" to-port="1" />
+ <edge from-layer="102" from-port="2" to-layer="103" to-port="0" />
+ <edge from-layer="103" from-port="1" to-layer="105" to-port="0" />
+ <edge from-layer="104" from-port="0" to-layer="105" to-port="1" />
+ <edge from-layer="105" from-port="2" to-layer="107" to-port="0" />
+ <edge from-layer="106" from-port="0" to-layer="107" to-port="1" />
+ <edge from-layer="107" from-port="2" to-layer="108" to-port="0" />
+ <edge from-layer="108" from-port="2" to-layer="110" to-port="0" />
+ <edge from-layer="109" from-port="0" to-layer="110" to-port="1" />
+ <edge from-layer="110" from-port="2" to-layer="112" to-port="0" />
+ <edge from-layer="111" from-port="0" to-layer="112" to-port="1" />
+ <edge from-layer="112" from-port="2" to-layer="114" to-port="0" />
+ <edge from-layer="113" from-port="0" to-layer="114" to-port="1" />
+ <edge from-layer="114" from-port="2" to-layer="124" to-port="0" />
+ <edge from-layer="114" from-port="2" to-layer="132" to-port="0" />
+ <edge from-layer="114" from-port="2" to-layer="152" to-port="1" />
+ <edge from-layer="114" from-port="2" to-layer="142" to-port="0" />
+ <edge from-layer="114" from-port="2" to-layer="116" to-port="0" />
+ <edge from-layer="115" from-port="0" to-layer="116" to-port="1" />
+ <edge from-layer="116" from-port="2" to-layer="118" to-port="0" />
+ <edge from-layer="117" from-port="0" to-layer="118" to-port="1" />
+ <edge from-layer="118" from-port="2" to-layer="120" to-port="0" />
+ <edge from-layer="119" from-port="0" to-layer="120" to-port="1" />
+ <edge from-layer="120" from-port="2" to-layer="122" to-port="0" />
+ <edge from-layer="121" from-port="0" to-layer="122" to-port="1" />
+ <edge from-layer="122" from-port="2" to-layer="139" to-port="0" />
+ <edge from-layer="123" from-port="0" to-layer="124" to-port="1" />
+ <edge from-layer="124" from-port="2" to-layer="126" to-port="0" />
+ <edge from-layer="125" from-port="0" to-layer="126" to-port="1" />
+ <edge from-layer="126" from-port="2" to-layer="128" to-port="0" />
+ <edge from-layer="127" from-port="0" to-layer="128" to-port="1" />
+ <edge from-layer="128" from-port="2" to-layer="130" to-port="0" />
+ <edge from-layer="129" from-port="0" to-layer="130" to-port="1" />
+ <edge from-layer="130" from-port="2" to-layer="139" to-port="1" />
+ <edge from-layer="131" from-port="0" to-layer="132" to-port="1" />
+ <edge from-layer="132" from-port="2" to-layer="134" to-port="0" />
+ <edge from-layer="133" from-port="0" to-layer="134" to-port="1" />
+ <edge from-layer="134" from-port="2" to-layer="136" to-port="0" />
+ <edge from-layer="135" from-port="0" to-layer="136" to-port="1" />
+ <edge from-layer="136" from-port="2" to-layer="138" to-port="0" />
+ <edge from-layer="137" from-port="0" to-layer="138" to-port="1" />
+ <edge from-layer="138" from-port="2" to-layer="139" to-port="2" />
+ <edge from-layer="139" from-port="4" to-layer="141" to-port="0" />
+ <edge from-layer="140" from-port="0" to-layer="141" to-port="1" />
+ <edge from-layer="141" from-port="2" to-layer="147" to-port="0" />
+ <edge from-layer="142" from-port="1" to-layer="145" to-port="0" />
+ <edge from-layer="143" from-port="0" to-layer="145" to-port="1" />
+ <edge from-layer="144" from-port="0" to-layer="145" to-port="2" />
+ <edge from-layer="145" from-port="3" to-layer="146" to-port="0" />
+ <edge from-layer="146" from-port="2" to-layer="147" to-port="1" />
+ <edge from-layer="147" from-port="2" to-layer="149" to-port="0" />
+ <edge from-layer="148" from-port="0" to-layer="149" to-port="1" />
+ <edge from-layer="149" from-port="2" to-layer="151" to-port="0" />
+ <edge from-layer="150" from-port="0" to-layer="151" to-port="1" />
+ <edge from-layer="151" from-port="2" to-layer="152" to-port="0" />
+ <edge from-layer="152" from-port="2" to-layer="154" to-port="0" />
+ <edge from-layer="153" from-port="0" to-layer="154" to-port="1" />
+ <edge from-layer="154" from-port="2" to-layer="156" to-port="0" />
+ <edge from-layer="155" from-port="0" to-layer="156" to-port="1" />
+ <edge from-layer="156" from-port="2" to-layer="158" to-port="0" />
+ <edge from-layer="157" from-port="0" to-layer="158" to-port="1" />
+ <edge from-layer="158" from-port="2" to-layer="160" to-port="0" />
+ <edge from-layer="158" from-port="2" to-layer="168" to-port="1" />
+ <edge from-layer="159" from-port="0" to-layer="160" to-port="1" />
+ <edge from-layer="160" from-port="2" to-layer="162" to-port="0" />
+ <edge from-layer="161" from-port="0" to-layer="162" to-port="1" />
+ <edge from-layer="162" from-port="2" to-layer="163" to-port="0" />
+ <edge from-layer="163" from-port="1" to-layer="165" to-port="0" />
+ <edge from-layer="164" from-port="0" to-layer="165" to-port="1" />
+ <edge from-layer="165" from-port="2" to-layer="167" to-port="0" />
+ <edge from-layer="166" from-port="0" to-layer="167" to-port="1" />
+ <edge from-layer="167" from-port="2" to-layer="168" to-port="0" />
+ <edge from-layer="168" from-port="2" to-layer="170" to-port="0" />
+ <edge from-layer="169" from-port="0" to-layer="170" to-port="1" />
+ <edge from-layer="170" from-port="2" to-layer="172" to-port="0" />
+ <edge from-layer="171" from-port="0" to-layer="172" to-port="1" />
+ <edge from-layer="172" from-port="2" to-layer="174" to-port="0" />
+ <edge from-layer="173" from-port="0" to-layer="174" to-port="1" />
+ <edge from-layer="174" from-port="2" to-layer="175" to-port="0" />
+ </edges>
+ <rt_info>
+ <Runtime_version value="2024.3.0-16041-1e3b88e4e3f-releases/2024/3" />
+ <conversion_parameters>
+ <framework value="pytorch" />
+ <is_python_object value="True" />
+ </conversion_parameters>
+ <optimum>
+ <optimum_intel_version value="1.19.0" />
+ <optimum_version value="1.22.0" />
+ <pytorch_version value="2.5.0.dev20240807+cu121" />
+ <transformers_version value="4.43.4" />
+ </optimum>
+ </rt_info>
+ </net>
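
The rt_info block above records how this IR was produced: optimum-intel 1.19.0 exporting the PyTorch checkpoint to OpenVINO. For reference, a minimal sketch of consuming a repository laid out like this one, assuming sentence-transformers >= 3.2.0 (which introduced the backend argument) installed via pip install "sentence-transformers[openvino]", and using this repo's id; the example sentences are illustrative only.

from sentence_transformers import SentenceTransformer

# backend="openvino" makes SentenceTransformer load openvino/openvino_model.xml
# (plus the matching openvino_model.bin weights) through optimum-intel,
# instead of the PyTorch weights.
model = SentenceTransformer(
    "sentence-transformers-testing/stsb-bert-tiny-openvino",  # repo id from this commit
    backend="openvino",
)

# This tiny BERT has a 128-dim hidden size (see the last_hidden_state dims above),
# so the pooled sentence embeddings are 128-dimensional.
embeddings = model.encode(["An example sentence", "Another example sentence"])
print(embeddings.shape)  # expected: (2, 128)

The same repo should remain loadable with the default backend="torch", since the OpenVINO export is an additional backend alongside the original weights rather than a replacement.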