learn3r committed on
Commit ee83c29 · 1 Parent(s): e276ba4

End of training

Files changed (5)
  1. README.md +4 -2
  2. all_results.json +13 -0
  3. eval_results.json +8 -0
  4. train_results.json +8 -0
  5. trainer_state.json +2157 -0
README.md CHANGED
@@ -3,6 +3,8 @@ license: apache-2.0
 base_model: facebook/bart-base
 tags:
 - generated_from_trainer
+datasets:
+- learn3r/squad_with_test
 model-index:
 - name: bart_base_qgen
   results: []
@@ -13,9 +15,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 # bart_base_qgen
 
-This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on the None dataset.
+This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on the learn3r/squad_with_test dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.4219
+- Loss: 1.4211
 
 ## Model description
 
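The updated card does not yet include a usage snippet, so here is a minimal sketch for trying the checkpoint. It assumes the repository id is `learn3r/bart_base_qgen` (matching the committer namespace and the model-index name) and that the model takes a plain SQuAD-style context as input; the preprocessing actually used during fine-tuning is not documented in this commit, so both are assumptions.

```python
# Minimal sketch (assumed repo id and input format; neither is documented in the card).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "learn3r/bart_base_qgen"  # assumption: committer namespace + model name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Assumption: the model generates a question from a raw context passage.
context = (
    "The Amazon rainforest covers most of the Amazon basin of South America, "
    "with the majority of the forest contained within Brazil."
)
inputs = tokenizer(context, return_tensors="pt", truncation=True, max_length=512)
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```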
all_results.json ADDED
@@ -0,0 +1,13 @@
+{
+    "epoch": 9.99,
+    "eval_loss": 1.421057105064392,
+    "eval_runtime": 54.5355,
+    "eval_samples": 5285,
+    "eval_samples_per_second": 96.909,
+    "eval_steps_per_second": 3.044,
+    "train_loss": 1.5164859166619373,
+    "train_runtime": 422176.3551,
+    "train_samples": 87599,
+    "train_samples_per_second": 2.075,
+    "train_steps_per_second": 0.004
+}
eval_results.json ADDED
@@ -0,0 +1,8 @@
+{
+    "epoch": 9.99,
+    "eval_loss": 1.421057105064392,
+    "eval_runtime": 54.5355,
+    "eval_samples": 5285,
+    "eval_samples_per_second": 96.909,
+    "eval_steps_per_second": 3.044
+}
train_results.json ADDED
@@ -0,0 +1,8 @@
+{
+    "epoch": 9.99,
+    "train_loss": 1.5164859166619373,
+    "train_runtime": 422176.3551,
+    "train_samples": 87599,
+    "train_samples_per_second": 2.075,
+    "train_steps_per_second": 0.004
+}
trainer_state.json ADDED
@@ -0,0 +1,2157 @@
1
+ {
2
+ "best_metric": 1.421057105064392,
3
+ "best_model_checkpoint": "/home/s1970716/models/bart_base_qgen/checkpoint-1369",
4
+ "epoch": 9.992695398100803,
5
+ "global_step": 1710,
6
+ "is_hyper_param_search": false,
7
+ "is_local_process_zero": true,
8
+ "is_world_process_zero": true,
9
+ "log_history": [
10
+ {
11
+ "epoch": 0.03,
12
+ "learning_rate": 1.994152046783626e-05,
13
+ "loss": 3.2937,
14
+ "step": 5
15
+ },
16
+ {
17
+ "epoch": 0.06,
18
+ "learning_rate": 1.9883040935672515e-05,
19
+ "loss": 2.4156,
20
+ "step": 10
21
+ },
22
+ {
23
+ "epoch": 0.09,
24
+ "learning_rate": 1.9824561403508773e-05,
25
+ "loss": 2.199,
26
+ "step": 15
27
+ },
28
+ {
29
+ "epoch": 0.12,
30
+ "learning_rate": 1.976608187134503e-05,
31
+ "loss": 2.0742,
32
+ "step": 20
33
+ },
34
+ {
35
+ "epoch": 0.15,
36
+ "learning_rate": 1.970760233918129e-05,
37
+ "loss": 2.0309,
38
+ "step": 25
39
+ },
40
+ {
41
+ "epoch": 0.18,
42
+ "learning_rate": 1.9649122807017544e-05,
43
+ "loss": 1.9927,
44
+ "step": 30
45
+ },
46
+ {
47
+ "epoch": 0.2,
48
+ "learning_rate": 1.9590643274853802e-05,
49
+ "loss": 1.9811,
50
+ "step": 35
51
+ },
52
+ {
53
+ "epoch": 0.23,
54
+ "learning_rate": 1.953216374269006e-05,
55
+ "loss": 1.9124,
56
+ "step": 40
57
+ },
58
+ {
59
+ "epoch": 0.26,
60
+ "learning_rate": 1.9473684210526318e-05,
61
+ "loss": 1.8912,
62
+ "step": 45
63
+ },
64
+ {
65
+ "epoch": 0.29,
66
+ "learning_rate": 1.9415204678362573e-05,
67
+ "loss": 1.92,
68
+ "step": 50
69
+ },
70
+ {
71
+ "epoch": 0.32,
72
+ "learning_rate": 1.935672514619883e-05,
73
+ "loss": 1.8673,
74
+ "step": 55
75
+ },
76
+ {
77
+ "epoch": 0.35,
78
+ "learning_rate": 1.929824561403509e-05,
79
+ "loss": 1.8451,
80
+ "step": 60
81
+ },
82
+ {
83
+ "epoch": 0.38,
84
+ "learning_rate": 1.9239766081871347e-05,
85
+ "loss": 1.7833,
86
+ "step": 65
87
+ },
88
+ {
89
+ "epoch": 0.41,
90
+ "learning_rate": 1.9181286549707602e-05,
91
+ "loss": 1.7887,
92
+ "step": 70
93
+ },
94
+ {
95
+ "epoch": 0.44,
96
+ "learning_rate": 1.912280701754386e-05,
97
+ "loss": 1.8294,
98
+ "step": 75
99
+ },
100
+ {
101
+ "epoch": 0.47,
102
+ "learning_rate": 1.9064327485380118e-05,
103
+ "loss": 1.8207,
104
+ "step": 80
105
+ },
106
+ {
107
+ "epoch": 0.5,
108
+ "learning_rate": 1.9005847953216376e-05,
109
+ "loss": 1.8147,
110
+ "step": 85
111
+ },
112
+ {
113
+ "epoch": 0.53,
114
+ "learning_rate": 1.894736842105263e-05,
115
+ "loss": 1.8262,
116
+ "step": 90
117
+ },
118
+ {
119
+ "epoch": 0.56,
120
+ "learning_rate": 1.888888888888889e-05,
121
+ "loss": 1.8197,
122
+ "step": 95
123
+ },
124
+ {
125
+ "epoch": 0.58,
126
+ "learning_rate": 1.8830409356725147e-05,
127
+ "loss": 1.7758,
128
+ "step": 100
129
+ },
130
+ {
131
+ "epoch": 0.61,
132
+ "learning_rate": 1.8771929824561405e-05,
133
+ "loss": 1.7745,
134
+ "step": 105
135
+ },
136
+ {
137
+ "epoch": 0.64,
138
+ "learning_rate": 1.871345029239766e-05,
139
+ "loss": 1.7955,
140
+ "step": 110
141
+ },
142
+ {
143
+ "epoch": 0.67,
144
+ "learning_rate": 1.8654970760233918e-05,
145
+ "loss": 1.7856,
146
+ "step": 115
147
+ },
148
+ {
149
+ "epoch": 0.7,
150
+ "learning_rate": 1.8596491228070176e-05,
151
+ "loss": 1.7584,
152
+ "step": 120
153
+ },
154
+ {
155
+ "epoch": 0.73,
156
+ "learning_rate": 1.8538011695906434e-05,
157
+ "loss": 1.7864,
158
+ "step": 125
159
+ },
160
+ {
161
+ "epoch": 0.76,
162
+ "learning_rate": 1.847953216374269e-05,
163
+ "loss": 1.7483,
164
+ "step": 130
165
+ },
166
+ {
167
+ "epoch": 0.79,
168
+ "learning_rate": 1.8421052631578947e-05,
169
+ "loss": 1.7431,
170
+ "step": 135
171
+ },
172
+ {
173
+ "epoch": 0.82,
174
+ "learning_rate": 1.8362573099415205e-05,
175
+ "loss": 1.7358,
176
+ "step": 140
177
+ },
178
+ {
179
+ "epoch": 0.85,
180
+ "learning_rate": 1.8304093567251464e-05,
181
+ "loss": 1.7497,
182
+ "step": 145
183
+ },
184
+ {
185
+ "epoch": 0.88,
186
+ "learning_rate": 1.824561403508772e-05,
187
+ "loss": 1.7249,
188
+ "step": 150
189
+ },
190
+ {
191
+ "epoch": 0.91,
192
+ "learning_rate": 1.8187134502923976e-05,
193
+ "loss": 1.7271,
194
+ "step": 155
195
+ },
196
+ {
197
+ "epoch": 0.93,
198
+ "learning_rate": 1.8128654970760235e-05,
199
+ "loss": 1.6804,
200
+ "step": 160
201
+ },
202
+ {
203
+ "epoch": 0.96,
204
+ "learning_rate": 1.8070175438596493e-05,
205
+ "loss": 1.7269,
206
+ "step": 165
207
+ },
208
+ {
209
+ "epoch": 0.99,
210
+ "learning_rate": 1.8011695906432747e-05,
211
+ "loss": 1.7237,
212
+ "step": 170
213
+ },
214
+ {
215
+ "epoch": 1.0,
216
+ "eval_loss": 1.5064257383346558,
217
+ "eval_runtime": 54.6397,
218
+ "eval_samples_per_second": 96.724,
219
+ "eval_steps_per_second": 3.038,
220
+ "step": 171
221
+ },
222
+ {
223
+ "epoch": 1.02,
224
+ "learning_rate": 1.7953216374269006e-05,
225
+ "loss": 1.6854,
226
+ "step": 175
227
+ },
228
+ {
229
+ "epoch": 1.05,
230
+ "learning_rate": 1.7894736842105264e-05,
231
+ "loss": 1.663,
232
+ "step": 180
233
+ },
234
+ {
235
+ "epoch": 1.08,
236
+ "learning_rate": 1.7836257309941522e-05,
237
+ "loss": 1.6874,
238
+ "step": 185
239
+ },
240
+ {
241
+ "epoch": 1.11,
242
+ "learning_rate": 1.7777777777777777e-05,
243
+ "loss": 1.6968,
244
+ "step": 190
245
+ },
246
+ {
247
+ "epoch": 1.14,
248
+ "learning_rate": 1.7719298245614035e-05,
249
+ "loss": 1.672,
250
+ "step": 195
251
+ },
252
+ {
253
+ "epoch": 1.17,
254
+ "learning_rate": 1.7660818713450293e-05,
255
+ "loss": 1.6925,
256
+ "step": 200
257
+ },
258
+ {
259
+ "epoch": 1.2,
260
+ "learning_rate": 1.760233918128655e-05,
261
+ "loss": 1.6568,
262
+ "step": 205
263
+ },
264
+ {
265
+ "epoch": 1.23,
266
+ "learning_rate": 1.754385964912281e-05,
267
+ "loss": 1.6246,
268
+ "step": 210
269
+ },
270
+ {
271
+ "epoch": 1.26,
272
+ "learning_rate": 1.7485380116959064e-05,
273
+ "loss": 1.6766,
274
+ "step": 215
275
+ },
276
+ {
277
+ "epoch": 1.29,
278
+ "learning_rate": 1.7426900584795322e-05,
279
+ "loss": 1.6567,
280
+ "step": 220
281
+ },
282
+ {
283
+ "epoch": 1.31,
284
+ "learning_rate": 1.736842105263158e-05,
285
+ "loss": 1.6632,
286
+ "step": 225
287
+ },
288
+ {
289
+ "epoch": 1.34,
290
+ "learning_rate": 1.7309941520467838e-05,
291
+ "loss": 1.6726,
292
+ "step": 230
293
+ },
294
+ {
295
+ "epoch": 1.37,
296
+ "learning_rate": 1.7251461988304093e-05,
297
+ "loss": 1.6241,
298
+ "step": 235
299
+ },
300
+ {
301
+ "epoch": 1.4,
302
+ "learning_rate": 1.719298245614035e-05,
303
+ "loss": 1.6584,
304
+ "step": 240
305
+ },
306
+ {
307
+ "epoch": 1.43,
308
+ "learning_rate": 1.713450292397661e-05,
309
+ "loss": 1.6508,
310
+ "step": 245
311
+ },
312
+ {
313
+ "epoch": 1.46,
314
+ "learning_rate": 1.7076023391812867e-05,
315
+ "loss": 1.6397,
316
+ "step": 250
317
+ },
318
+ {
319
+ "epoch": 1.49,
320
+ "learning_rate": 1.7017543859649125e-05,
321
+ "loss": 1.6378,
322
+ "step": 255
323
+ },
324
+ {
325
+ "epoch": 1.52,
326
+ "learning_rate": 1.695906432748538e-05,
327
+ "loss": 1.6827,
328
+ "step": 260
329
+ },
330
+ {
331
+ "epoch": 1.55,
332
+ "learning_rate": 1.690058479532164e-05,
333
+ "loss": 1.658,
334
+ "step": 265
335
+ },
336
+ {
337
+ "epoch": 1.58,
338
+ "learning_rate": 1.6842105263157896e-05,
339
+ "loss": 1.6236,
340
+ "step": 270
341
+ },
342
+ {
343
+ "epoch": 1.61,
344
+ "learning_rate": 1.6783625730994155e-05,
345
+ "loss": 1.6602,
346
+ "step": 275
347
+ },
348
+ {
349
+ "epoch": 1.64,
350
+ "learning_rate": 1.672514619883041e-05,
351
+ "loss": 1.6419,
352
+ "step": 280
353
+ },
354
+ {
355
+ "epoch": 1.67,
356
+ "learning_rate": 1.6666666666666667e-05,
357
+ "loss": 1.6307,
358
+ "step": 285
359
+ },
360
+ {
361
+ "epoch": 1.69,
362
+ "learning_rate": 1.6608187134502926e-05,
363
+ "loss": 1.6179,
364
+ "step": 290
365
+ },
366
+ {
367
+ "epoch": 1.72,
368
+ "learning_rate": 1.6549707602339184e-05,
369
+ "loss": 1.6396,
370
+ "step": 295
371
+ },
372
+ {
373
+ "epoch": 1.75,
374
+ "learning_rate": 1.649122807017544e-05,
375
+ "loss": 1.6342,
376
+ "step": 300
377
+ },
378
+ {
379
+ "epoch": 1.78,
380
+ "learning_rate": 1.6432748538011697e-05,
381
+ "loss": 1.6414,
382
+ "step": 305
383
+ },
384
+ {
385
+ "epoch": 1.81,
386
+ "learning_rate": 1.6374269005847955e-05,
387
+ "loss": 1.6393,
388
+ "step": 310
389
+ },
390
+ {
391
+ "epoch": 1.84,
392
+ "learning_rate": 1.6315789473684213e-05,
393
+ "loss": 1.6478,
394
+ "step": 315
395
+ },
396
+ {
397
+ "epoch": 1.87,
398
+ "learning_rate": 1.625730994152047e-05,
399
+ "loss": 1.6384,
400
+ "step": 320
401
+ },
402
+ {
403
+ "epoch": 1.9,
404
+ "learning_rate": 1.6198830409356726e-05,
405
+ "loss": 1.61,
406
+ "step": 325
407
+ },
408
+ {
409
+ "epoch": 1.93,
410
+ "learning_rate": 1.6140350877192984e-05,
411
+ "loss": 1.6028,
412
+ "step": 330
413
+ },
414
+ {
415
+ "epoch": 1.96,
416
+ "learning_rate": 1.6081871345029242e-05,
417
+ "loss": 1.5761,
418
+ "step": 335
419
+ },
420
+ {
421
+ "epoch": 1.99,
422
+ "learning_rate": 1.60233918128655e-05,
423
+ "loss": 1.5995,
424
+ "step": 340
425
+ },
426
+ {
427
+ "epoch": 2.0,
428
+ "eval_loss": 1.4697957038879395,
429
+ "eval_runtime": 54.6066,
430
+ "eval_samples_per_second": 96.783,
431
+ "eval_steps_per_second": 3.04,
432
+ "step": 342
433
+ },
434
+ {
435
+ "epoch": 2.02,
436
+ "learning_rate": 1.5964912280701755e-05,
437
+ "loss": 1.6004,
438
+ "step": 345
439
+ },
440
+ {
441
+ "epoch": 2.05,
442
+ "learning_rate": 1.5906432748538013e-05,
443
+ "loss": 1.5699,
444
+ "step": 350
445
+ },
446
+ {
447
+ "epoch": 2.07,
448
+ "learning_rate": 1.584795321637427e-05,
449
+ "loss": 1.5738,
450
+ "step": 355
451
+ },
452
+ {
453
+ "epoch": 2.1,
454
+ "learning_rate": 1.578947368421053e-05,
455
+ "loss": 1.5974,
456
+ "step": 360
457
+ },
458
+ {
459
+ "epoch": 2.13,
460
+ "learning_rate": 1.5730994152046787e-05,
461
+ "loss": 1.5888,
462
+ "step": 365
463
+ },
464
+ {
465
+ "epoch": 2.16,
466
+ "learning_rate": 1.5672514619883042e-05,
467
+ "loss": 1.5881,
468
+ "step": 370
469
+ },
470
+ {
471
+ "epoch": 2.19,
472
+ "learning_rate": 1.56140350877193e-05,
473
+ "loss": 1.5908,
474
+ "step": 375
475
+ },
476
+ {
477
+ "epoch": 2.22,
478
+ "learning_rate": 1.555555555555556e-05,
479
+ "loss": 1.5866,
480
+ "step": 380
481
+ },
482
+ {
483
+ "epoch": 2.25,
484
+ "learning_rate": 1.5497076023391816e-05,
485
+ "loss": 1.5688,
486
+ "step": 385
487
+ },
488
+ {
489
+ "epoch": 2.28,
490
+ "learning_rate": 1.543859649122807e-05,
491
+ "loss": 1.5718,
492
+ "step": 390
493
+ },
494
+ {
495
+ "epoch": 2.31,
496
+ "learning_rate": 1.538011695906433e-05,
497
+ "loss": 1.5862,
498
+ "step": 395
499
+ },
500
+ {
501
+ "epoch": 2.34,
502
+ "learning_rate": 1.5321637426900587e-05,
503
+ "loss": 1.5568,
504
+ "step": 400
505
+ },
506
+ {
507
+ "epoch": 2.37,
508
+ "learning_rate": 1.5263157894736846e-05,
509
+ "loss": 1.5796,
510
+ "step": 405
511
+ },
512
+ {
513
+ "epoch": 2.4,
514
+ "learning_rate": 1.52046783625731e-05,
515
+ "loss": 1.5615,
516
+ "step": 410
517
+ },
518
+ {
519
+ "epoch": 2.43,
520
+ "learning_rate": 1.5146198830409358e-05,
521
+ "loss": 1.5674,
522
+ "step": 415
523
+ },
524
+ {
525
+ "epoch": 2.45,
526
+ "learning_rate": 1.5087719298245615e-05,
527
+ "loss": 1.5974,
528
+ "step": 420
529
+ },
530
+ {
531
+ "epoch": 2.48,
532
+ "learning_rate": 1.5029239766081873e-05,
533
+ "loss": 1.552,
534
+ "step": 425
535
+ },
536
+ {
537
+ "epoch": 2.51,
538
+ "learning_rate": 1.497076023391813e-05,
539
+ "loss": 1.5843,
540
+ "step": 430
541
+ },
542
+ {
543
+ "epoch": 2.54,
544
+ "learning_rate": 1.4912280701754388e-05,
545
+ "loss": 1.5495,
546
+ "step": 435
547
+ },
548
+ {
549
+ "epoch": 2.57,
550
+ "learning_rate": 1.4853801169590644e-05,
551
+ "loss": 1.576,
552
+ "step": 440
553
+ },
554
+ {
555
+ "epoch": 2.6,
556
+ "learning_rate": 1.4795321637426902e-05,
557
+ "loss": 1.5556,
558
+ "step": 445
559
+ },
560
+ {
561
+ "epoch": 2.63,
562
+ "learning_rate": 1.4736842105263159e-05,
563
+ "loss": 1.5451,
564
+ "step": 450
565
+ },
566
+ {
567
+ "epoch": 2.66,
568
+ "learning_rate": 1.4678362573099417e-05,
569
+ "loss": 1.5697,
570
+ "step": 455
571
+ },
572
+ {
573
+ "epoch": 2.69,
574
+ "learning_rate": 1.4619883040935675e-05,
575
+ "loss": 1.5698,
576
+ "step": 460
577
+ },
578
+ {
579
+ "epoch": 2.72,
580
+ "learning_rate": 1.4561403508771931e-05,
581
+ "loss": 1.571,
582
+ "step": 465
583
+ },
584
+ {
585
+ "epoch": 2.75,
586
+ "learning_rate": 1.4502923976608188e-05,
587
+ "loss": 1.5633,
588
+ "step": 470
589
+ },
590
+ {
591
+ "epoch": 2.78,
592
+ "learning_rate": 1.4444444444444446e-05,
593
+ "loss": 1.5628,
594
+ "step": 475
595
+ },
596
+ {
597
+ "epoch": 2.8,
598
+ "learning_rate": 1.4385964912280704e-05,
599
+ "loss": 1.5627,
600
+ "step": 480
601
+ },
602
+ {
603
+ "epoch": 2.83,
604
+ "learning_rate": 1.432748538011696e-05,
605
+ "loss": 1.5465,
606
+ "step": 485
607
+ },
608
+ {
609
+ "epoch": 2.86,
610
+ "learning_rate": 1.4269005847953217e-05,
611
+ "loss": 1.5661,
612
+ "step": 490
613
+ },
614
+ {
615
+ "epoch": 2.89,
616
+ "learning_rate": 1.4210526315789475e-05,
617
+ "loss": 1.5996,
618
+ "step": 495
619
+ },
620
+ {
621
+ "epoch": 2.92,
622
+ "learning_rate": 1.4152046783625733e-05,
623
+ "loss": 1.5689,
624
+ "step": 500
625
+ },
626
+ {
627
+ "epoch": 2.95,
628
+ "learning_rate": 1.409356725146199e-05,
629
+ "loss": 1.5207,
630
+ "step": 505
631
+ },
632
+ {
633
+ "epoch": 2.98,
634
+ "learning_rate": 1.4035087719298246e-05,
635
+ "loss": 1.5289,
636
+ "step": 510
637
+ },
638
+ {
639
+ "epoch": 3.0,
640
+ "eval_loss": 1.4481685161590576,
641
+ "eval_runtime": 54.5707,
642
+ "eval_samples_per_second": 96.847,
643
+ "eval_steps_per_second": 3.042,
644
+ "step": 513
645
+ },
646
+ {
647
+ "epoch": 3.01,
648
+ "learning_rate": 1.3976608187134504e-05,
649
+ "loss": 1.5335,
650
+ "step": 515
651
+ },
652
+ {
653
+ "epoch": 3.04,
654
+ "learning_rate": 1.3918128654970762e-05,
655
+ "loss": 1.5218,
656
+ "step": 520
657
+ },
658
+ {
659
+ "epoch": 3.07,
660
+ "learning_rate": 1.385964912280702e-05,
661
+ "loss": 1.5529,
662
+ "step": 525
663
+ },
664
+ {
665
+ "epoch": 3.1,
666
+ "learning_rate": 1.3801169590643275e-05,
667
+ "loss": 1.5287,
668
+ "step": 530
669
+ },
670
+ {
671
+ "epoch": 3.13,
672
+ "learning_rate": 1.3742690058479533e-05,
673
+ "loss": 1.5382,
674
+ "step": 535
675
+ },
676
+ {
677
+ "epoch": 3.16,
678
+ "learning_rate": 1.3684210526315791e-05,
679
+ "loss": 1.5375,
680
+ "step": 540
681
+ },
682
+ {
683
+ "epoch": 3.18,
684
+ "learning_rate": 1.362573099415205e-05,
685
+ "loss": 1.5205,
686
+ "step": 545
687
+ },
688
+ {
689
+ "epoch": 3.21,
690
+ "learning_rate": 1.3567251461988304e-05,
691
+ "loss": 1.5213,
692
+ "step": 550
693
+ },
694
+ {
695
+ "epoch": 3.24,
696
+ "learning_rate": 1.3508771929824562e-05,
697
+ "loss": 1.5088,
698
+ "step": 555
699
+ },
700
+ {
701
+ "epoch": 3.27,
702
+ "learning_rate": 1.345029239766082e-05,
703
+ "loss": 1.4972,
704
+ "step": 560
705
+ },
706
+ {
707
+ "epoch": 3.3,
708
+ "learning_rate": 1.3391812865497079e-05,
709
+ "loss": 1.4984,
710
+ "step": 565
711
+ },
712
+ {
713
+ "epoch": 3.33,
714
+ "learning_rate": 1.3333333333333333e-05,
715
+ "loss": 1.5149,
716
+ "step": 570
717
+ },
718
+ {
719
+ "epoch": 3.36,
720
+ "learning_rate": 1.3274853801169591e-05,
721
+ "loss": 1.4956,
722
+ "step": 575
723
+ },
724
+ {
725
+ "epoch": 3.39,
726
+ "learning_rate": 1.321637426900585e-05,
727
+ "loss": 1.5321,
728
+ "step": 580
729
+ },
730
+ {
731
+ "epoch": 3.42,
732
+ "learning_rate": 1.3157894736842108e-05,
733
+ "loss": 1.5237,
734
+ "step": 585
735
+ },
736
+ {
737
+ "epoch": 3.45,
738
+ "learning_rate": 1.3099415204678362e-05,
739
+ "loss": 1.5124,
740
+ "step": 590
741
+ },
742
+ {
743
+ "epoch": 3.48,
744
+ "learning_rate": 1.304093567251462e-05,
745
+ "loss": 1.5087,
746
+ "step": 595
747
+ },
748
+ {
749
+ "epoch": 3.51,
750
+ "learning_rate": 1.2982456140350879e-05,
751
+ "loss": 1.5042,
752
+ "step": 600
753
+ },
754
+ {
755
+ "epoch": 3.54,
756
+ "learning_rate": 1.2923976608187137e-05,
757
+ "loss": 1.5324,
758
+ "step": 605
759
+ },
760
+ {
761
+ "epoch": 3.56,
762
+ "learning_rate": 1.2865497076023392e-05,
763
+ "loss": 1.5298,
764
+ "step": 610
765
+ },
766
+ {
767
+ "epoch": 3.59,
768
+ "learning_rate": 1.280701754385965e-05,
769
+ "loss": 1.5311,
770
+ "step": 615
771
+ },
772
+ {
773
+ "epoch": 3.62,
774
+ "learning_rate": 1.2748538011695908e-05,
775
+ "loss": 1.5229,
776
+ "step": 620
777
+ },
778
+ {
779
+ "epoch": 3.65,
780
+ "learning_rate": 1.2690058479532166e-05,
781
+ "loss": 1.4949,
782
+ "step": 625
783
+ },
784
+ {
785
+ "epoch": 3.68,
786
+ "learning_rate": 1.263157894736842e-05,
787
+ "loss": 1.5191,
788
+ "step": 630
789
+ },
790
+ {
791
+ "epoch": 3.71,
792
+ "learning_rate": 1.2573099415204679e-05,
793
+ "loss": 1.4942,
794
+ "step": 635
795
+ },
796
+ {
797
+ "epoch": 3.74,
798
+ "learning_rate": 1.2514619883040937e-05,
799
+ "loss": 1.4864,
800
+ "step": 640
801
+ },
802
+ {
803
+ "epoch": 3.77,
804
+ "learning_rate": 1.2456140350877195e-05,
805
+ "loss": 1.4959,
806
+ "step": 645
807
+ },
808
+ {
809
+ "epoch": 3.8,
810
+ "learning_rate": 1.239766081871345e-05,
811
+ "loss": 1.504,
812
+ "step": 650
813
+ },
814
+ {
815
+ "epoch": 3.83,
816
+ "learning_rate": 1.2339181286549708e-05,
817
+ "loss": 1.5097,
818
+ "step": 655
819
+ },
820
+ {
821
+ "epoch": 3.86,
822
+ "learning_rate": 1.2280701754385966e-05,
823
+ "loss": 1.5135,
824
+ "step": 660
825
+ },
826
+ {
827
+ "epoch": 3.89,
828
+ "learning_rate": 1.2222222222222224e-05,
829
+ "loss": 1.4982,
830
+ "step": 665
831
+ },
832
+ {
833
+ "epoch": 3.92,
834
+ "learning_rate": 1.216374269005848e-05,
835
+ "loss": 1.5007,
836
+ "step": 670
837
+ },
838
+ {
839
+ "epoch": 3.94,
840
+ "learning_rate": 1.2105263157894737e-05,
841
+ "loss": 1.5467,
842
+ "step": 675
843
+ },
844
+ {
845
+ "epoch": 3.97,
846
+ "learning_rate": 1.2046783625730995e-05,
847
+ "loss": 1.5082,
848
+ "step": 680
849
+ },
850
+ {
851
+ "epoch": 4.0,
852
+ "eval_loss": 1.4363151788711548,
853
+ "eval_runtime": 54.5283,
854
+ "eval_samples_per_second": 96.922,
855
+ "eval_steps_per_second": 3.044,
856
+ "step": 684
857
+ },
858
+ {
859
+ "epoch": 4.0,
860
+ "learning_rate": 1.1988304093567253e-05,
861
+ "loss": 1.5229,
862
+ "step": 685
863
+ },
864
+ {
865
+ "epoch": 4.03,
866
+ "learning_rate": 1.192982456140351e-05,
867
+ "loss": 1.458,
868
+ "step": 690
869
+ },
870
+ {
871
+ "epoch": 4.06,
872
+ "learning_rate": 1.1871345029239766e-05,
873
+ "loss": 1.4808,
874
+ "step": 695
875
+ },
876
+ {
877
+ "epoch": 4.09,
878
+ "learning_rate": 1.1812865497076024e-05,
879
+ "loss": 1.5026,
880
+ "step": 700
881
+ },
882
+ {
883
+ "epoch": 4.12,
884
+ "learning_rate": 1.1754385964912282e-05,
885
+ "loss": 1.4988,
886
+ "step": 705
887
+ },
888
+ {
889
+ "epoch": 4.15,
890
+ "learning_rate": 1.1695906432748539e-05,
891
+ "loss": 1.4884,
892
+ "step": 710
893
+ },
894
+ {
895
+ "epoch": 4.18,
896
+ "learning_rate": 1.1637426900584797e-05,
897
+ "loss": 1.4833,
898
+ "step": 715
899
+ },
900
+ {
901
+ "epoch": 4.21,
902
+ "learning_rate": 1.1578947368421053e-05,
903
+ "loss": 1.4669,
904
+ "step": 720
905
+ },
906
+ {
907
+ "epoch": 4.24,
908
+ "learning_rate": 1.1520467836257312e-05,
909
+ "loss": 1.4759,
910
+ "step": 725
911
+ },
912
+ {
913
+ "epoch": 4.27,
914
+ "learning_rate": 1.1461988304093568e-05,
915
+ "loss": 1.4444,
916
+ "step": 730
917
+ },
918
+ {
919
+ "epoch": 4.3,
920
+ "learning_rate": 1.1403508771929826e-05,
921
+ "loss": 1.4659,
922
+ "step": 735
923
+ },
924
+ {
925
+ "epoch": 4.32,
926
+ "learning_rate": 1.1345029239766083e-05,
927
+ "loss": 1.4745,
928
+ "step": 740
929
+ },
930
+ {
931
+ "epoch": 4.35,
932
+ "learning_rate": 1.128654970760234e-05,
933
+ "loss": 1.4783,
934
+ "step": 745
935
+ },
936
+ {
937
+ "epoch": 4.38,
938
+ "learning_rate": 1.1228070175438597e-05,
939
+ "loss": 1.4604,
940
+ "step": 750
941
+ },
942
+ {
943
+ "epoch": 4.41,
944
+ "learning_rate": 1.1169590643274855e-05,
945
+ "loss": 1.467,
946
+ "step": 755
947
+ },
948
+ {
949
+ "epoch": 4.44,
950
+ "learning_rate": 1.1111111111111113e-05,
951
+ "loss": 1.486,
952
+ "step": 760
953
+ },
954
+ {
955
+ "epoch": 4.47,
956
+ "learning_rate": 1.105263157894737e-05,
957
+ "loss": 1.4789,
958
+ "step": 765
959
+ },
960
+ {
961
+ "epoch": 4.5,
962
+ "learning_rate": 1.0994152046783626e-05,
963
+ "loss": 1.4938,
964
+ "step": 770
965
+ },
966
+ {
967
+ "epoch": 4.53,
968
+ "learning_rate": 1.0935672514619884e-05,
969
+ "loss": 1.4702,
970
+ "step": 775
971
+ },
972
+ {
973
+ "epoch": 4.56,
974
+ "learning_rate": 1.0877192982456142e-05,
975
+ "loss": 1.4938,
976
+ "step": 780
977
+ },
978
+ {
979
+ "epoch": 4.59,
980
+ "learning_rate": 1.0818713450292399e-05,
981
+ "loss": 1.4807,
982
+ "step": 785
983
+ },
984
+ {
985
+ "epoch": 4.62,
986
+ "learning_rate": 1.0760233918128655e-05,
987
+ "loss": 1.487,
988
+ "step": 790
989
+ },
990
+ {
991
+ "epoch": 4.65,
992
+ "learning_rate": 1.0701754385964913e-05,
993
+ "loss": 1.4808,
994
+ "step": 795
995
+ },
996
+ {
997
+ "epoch": 4.67,
998
+ "learning_rate": 1.0643274853801172e-05,
999
+ "loss": 1.4707,
1000
+ "step": 800
1001
+ },
1002
+ {
1003
+ "epoch": 4.7,
1004
+ "learning_rate": 1.0584795321637428e-05,
1005
+ "loss": 1.4748,
1006
+ "step": 805
1007
+ },
1008
+ {
1009
+ "epoch": 4.73,
1010
+ "learning_rate": 1.0526315789473684e-05,
1011
+ "loss": 1.4728,
1012
+ "step": 810
1013
+ },
1014
+ {
1015
+ "epoch": 4.76,
1016
+ "learning_rate": 1.0467836257309943e-05,
1017
+ "loss": 1.4848,
1018
+ "step": 815
1019
+ },
1020
+ {
1021
+ "epoch": 4.79,
1022
+ "learning_rate": 1.04093567251462e-05,
1023
+ "loss": 1.4664,
1024
+ "step": 820
1025
+ },
1026
+ {
1027
+ "epoch": 4.82,
1028
+ "learning_rate": 1.0350877192982459e-05,
1029
+ "loss": 1.4639,
1030
+ "step": 825
1031
+ },
1032
+ {
1033
+ "epoch": 4.85,
1034
+ "learning_rate": 1.0292397660818714e-05,
1035
+ "loss": 1.5213,
1036
+ "step": 830
1037
+ },
1038
+ {
1039
+ "epoch": 4.88,
1040
+ "learning_rate": 1.0233918128654972e-05,
1041
+ "loss": 1.4455,
1042
+ "step": 835
1043
+ },
1044
+ {
1045
+ "epoch": 4.91,
1046
+ "learning_rate": 1.017543859649123e-05,
1047
+ "loss": 1.5129,
1048
+ "step": 840
1049
+ },
1050
+ {
1051
+ "epoch": 4.94,
1052
+ "learning_rate": 1.0116959064327488e-05,
1053
+ "loss": 1.4527,
1054
+ "step": 845
1055
+ },
1056
+ {
1057
+ "epoch": 4.97,
1058
+ "learning_rate": 1.0058479532163743e-05,
1059
+ "loss": 1.4543,
1060
+ "step": 850
1061
+ },
1062
+ {
1063
+ "epoch": 5.0,
1064
+ "learning_rate": 1e-05,
1065
+ "loss": 1.4782,
1066
+ "step": 855
1067
+ },
1068
+ {
1069
+ "epoch": 5.0,
1070
+ "eval_loss": 1.4285916090011597,
1071
+ "eval_runtime": 54.3492,
1072
+ "eval_samples_per_second": 97.242,
1073
+ "eval_steps_per_second": 3.054,
1074
+ "step": 855
1075
+ },
1076
+ {
1077
+ "epoch": 5.03,
1078
+ "learning_rate": 9.941520467836257e-06,
1079
+ "loss": 1.4358,
1080
+ "step": 860
1081
+ },
1082
+ {
1083
+ "epoch": 5.05,
1084
+ "learning_rate": 9.883040935672515e-06,
1085
+ "loss": 1.4702,
1086
+ "step": 865
1087
+ },
1088
+ {
1089
+ "epoch": 5.08,
1090
+ "learning_rate": 9.824561403508772e-06,
1091
+ "loss": 1.4468,
1092
+ "step": 870
1093
+ },
1094
+ {
1095
+ "epoch": 5.11,
1096
+ "learning_rate": 9.76608187134503e-06,
1097
+ "loss": 1.4365,
1098
+ "step": 875
1099
+ },
1100
+ {
1101
+ "epoch": 5.14,
1102
+ "learning_rate": 9.707602339181286e-06,
1103
+ "loss": 1.462,
1104
+ "step": 880
1105
+ },
1106
+ {
1107
+ "epoch": 5.17,
1108
+ "learning_rate": 9.649122807017545e-06,
1109
+ "loss": 1.4665,
1110
+ "step": 885
1111
+ },
1112
+ {
1113
+ "epoch": 5.2,
1114
+ "learning_rate": 9.590643274853801e-06,
1115
+ "loss": 1.4579,
1116
+ "step": 890
1117
+ },
1118
+ {
1119
+ "epoch": 5.23,
1120
+ "learning_rate": 9.532163742690059e-06,
1121
+ "loss": 1.445,
1122
+ "step": 895
1123
+ },
1124
+ {
1125
+ "epoch": 5.26,
1126
+ "learning_rate": 9.473684210526315e-06,
1127
+ "loss": 1.4563,
1128
+ "step": 900
1129
+ },
1130
+ {
1131
+ "epoch": 5.29,
1132
+ "learning_rate": 9.415204678362574e-06,
1133
+ "loss": 1.4286,
1134
+ "step": 905
1135
+ },
1136
+ {
1137
+ "epoch": 5.32,
1138
+ "learning_rate": 9.35672514619883e-06,
1139
+ "loss": 1.4753,
1140
+ "step": 910
1141
+ },
1142
+ {
1143
+ "epoch": 5.35,
1144
+ "learning_rate": 9.298245614035088e-06,
1145
+ "loss": 1.4628,
1146
+ "step": 915
1147
+ },
1148
+ {
1149
+ "epoch": 5.38,
1150
+ "learning_rate": 9.239766081871345e-06,
1151
+ "loss": 1.4685,
1152
+ "step": 920
1153
+ },
1154
+ {
1155
+ "epoch": 5.41,
1156
+ "learning_rate": 9.181286549707603e-06,
1157
+ "loss": 1.4559,
1158
+ "step": 925
1159
+ },
1160
+ {
1161
+ "epoch": 5.43,
1162
+ "learning_rate": 9.12280701754386e-06,
1163
+ "loss": 1.435,
1164
+ "step": 930
1165
+ },
1166
+ {
1167
+ "epoch": 5.46,
1168
+ "learning_rate": 9.064327485380117e-06,
1169
+ "loss": 1.4272,
1170
+ "step": 935
1171
+ },
1172
+ {
1173
+ "epoch": 5.49,
1174
+ "learning_rate": 9.005847953216374e-06,
1175
+ "loss": 1.4592,
1176
+ "step": 940
1177
+ },
1178
+ {
1179
+ "epoch": 5.52,
1180
+ "learning_rate": 8.947368421052632e-06,
1181
+ "loss": 1.4264,
1182
+ "step": 945
1183
+ },
1184
+ {
1185
+ "epoch": 5.55,
1186
+ "learning_rate": 8.888888888888888e-06,
1187
+ "loss": 1.4329,
1188
+ "step": 950
1189
+ },
1190
+ {
1191
+ "epoch": 5.58,
1192
+ "learning_rate": 8.830409356725146e-06,
1193
+ "loss": 1.4368,
1194
+ "step": 955
1195
+ },
1196
+ {
1197
+ "epoch": 5.61,
1198
+ "learning_rate": 8.771929824561405e-06,
1199
+ "loss": 1.4481,
1200
+ "step": 960
1201
+ },
1202
+ {
1203
+ "epoch": 5.64,
1204
+ "learning_rate": 8.713450292397661e-06,
1205
+ "loss": 1.458,
1206
+ "step": 965
1207
+ },
1208
+ {
1209
+ "epoch": 5.67,
1210
+ "learning_rate": 8.654970760233919e-06,
1211
+ "loss": 1.4302,
1212
+ "step": 970
1213
+ },
1214
+ {
1215
+ "epoch": 5.7,
1216
+ "learning_rate": 8.596491228070176e-06,
1217
+ "loss": 1.4269,
1218
+ "step": 975
1219
+ },
1220
+ {
1221
+ "epoch": 5.73,
1222
+ "learning_rate": 8.538011695906434e-06,
1223
+ "loss": 1.4743,
1224
+ "step": 980
1225
+ },
1226
+ {
1227
+ "epoch": 5.76,
1228
+ "learning_rate": 8.47953216374269e-06,
1229
+ "loss": 1.4397,
1230
+ "step": 985
1231
+ },
1232
+ {
1233
+ "epoch": 5.79,
1234
+ "learning_rate": 8.421052631578948e-06,
1235
+ "loss": 1.4537,
1236
+ "step": 990
1237
+ },
1238
+ {
1239
+ "epoch": 5.81,
1240
+ "learning_rate": 8.362573099415205e-06,
1241
+ "loss": 1.4483,
1242
+ "step": 995
1243
+ },
1244
+ {
1245
+ "epoch": 5.84,
1246
+ "learning_rate": 8.304093567251463e-06,
1247
+ "loss": 1.479,
1248
+ "step": 1000
1249
+ },
1250
+ {
1251
+ "epoch": 5.87,
1252
+ "learning_rate": 8.24561403508772e-06,
1253
+ "loss": 1.4238,
1254
+ "step": 1005
1255
+ },
1256
+ {
1257
+ "epoch": 5.9,
1258
+ "learning_rate": 8.187134502923977e-06,
1259
+ "loss": 1.4544,
1260
+ "step": 1010
1261
+ },
1262
+ {
1263
+ "epoch": 5.93,
1264
+ "learning_rate": 8.128654970760235e-06,
1265
+ "loss": 1.4557,
1266
+ "step": 1015
1267
+ },
1268
+ {
1269
+ "epoch": 5.96,
1270
+ "learning_rate": 8.070175438596492e-06,
1271
+ "loss": 1.4514,
1272
+ "step": 1020
1273
+ },
1274
+ {
1275
+ "epoch": 5.99,
1276
+ "learning_rate": 8.01169590643275e-06,
1277
+ "loss": 1.4084,
1278
+ "step": 1025
1279
+ },
1280
+ {
1281
+ "epoch": 6.0,
1282
+ "eval_loss": 1.4264638423919678,
1283
+ "eval_runtime": 54.613,
1284
+ "eval_samples_per_second": 96.772,
1285
+ "eval_steps_per_second": 3.04,
1286
+ "step": 1026
1287
+ },
1288
+ {
1289
+ "epoch": 6.02,
1290
+ "learning_rate": 7.953216374269006e-06,
1291
+ "loss": 1.4231,
1292
+ "step": 1030
1293
+ },
1294
+ {
1295
+ "epoch": 6.05,
1296
+ "learning_rate": 7.894736842105265e-06,
1297
+ "loss": 1.4327,
1298
+ "step": 1035
1299
+ },
1300
+ {
1301
+ "epoch": 6.08,
1302
+ "learning_rate": 7.836257309941521e-06,
1303
+ "loss": 1.4188,
1304
+ "step": 1040
1305
+ },
1306
+ {
1307
+ "epoch": 6.11,
1308
+ "learning_rate": 7.77777777777778e-06,
1309
+ "loss": 1.4453,
1310
+ "step": 1045
1311
+ },
1312
+ {
1313
+ "epoch": 6.14,
1314
+ "learning_rate": 7.719298245614036e-06,
1315
+ "loss": 1.4,
1316
+ "step": 1050
1317
+ },
1318
+ {
1319
+ "epoch": 6.17,
1320
+ "learning_rate": 7.660818713450294e-06,
1321
+ "loss": 1.3889,
1322
+ "step": 1055
1323
+ },
1324
+ {
1325
+ "epoch": 6.19,
1326
+ "learning_rate": 7.60233918128655e-06,
1327
+ "loss": 1.4013,
1328
+ "step": 1060
1329
+ },
1330
+ {
1331
+ "epoch": 6.22,
1332
+ "learning_rate": 7.5438596491228074e-06,
1333
+ "loss": 1.4102,
1334
+ "step": 1065
1335
+ },
1336
+ {
1337
+ "epoch": 6.25,
1338
+ "learning_rate": 7.485380116959065e-06,
1339
+ "loss": 1.3953,
1340
+ "step": 1070
1341
+ },
1342
+ {
1343
+ "epoch": 6.28,
1344
+ "learning_rate": 7.426900584795322e-06,
1345
+ "loss": 1.4428,
1346
+ "step": 1075
1347
+ },
1348
+ {
1349
+ "epoch": 6.31,
1350
+ "learning_rate": 7.368421052631579e-06,
1351
+ "loss": 1.4446,
1352
+ "step": 1080
1353
+ },
1354
+ {
1355
+ "epoch": 6.34,
1356
+ "learning_rate": 7.309941520467837e-06,
1357
+ "loss": 1.4221,
1358
+ "step": 1085
1359
+ },
1360
+ {
1361
+ "epoch": 6.37,
1362
+ "learning_rate": 7.251461988304094e-06,
1363
+ "loss": 1.437,
1364
+ "step": 1090
1365
+ },
1366
+ {
1367
+ "epoch": 6.4,
1368
+ "learning_rate": 7.192982456140352e-06,
1369
+ "loss": 1.4144,
1370
+ "step": 1095
1371
+ },
1372
+ {
1373
+ "epoch": 6.43,
1374
+ "learning_rate": 7.134502923976608e-06,
1375
+ "loss": 1.4665,
1376
+ "step": 1100
1377
+ },
1378
+ {
1379
+ "epoch": 6.46,
1380
+ "learning_rate": 7.0760233918128665e-06,
1381
+ "loss": 1.4175,
1382
+ "step": 1105
1383
+ },
1384
+ {
1385
+ "epoch": 6.49,
1386
+ "learning_rate": 7.017543859649123e-06,
1387
+ "loss": 1.4346,
1388
+ "step": 1110
1389
+ },
1390
+ {
1391
+ "epoch": 6.52,
1392
+ "learning_rate": 6.959064327485381e-06,
1393
+ "loss": 1.422,
1394
+ "step": 1115
1395
+ },
1396
+ {
1397
+ "epoch": 6.54,
1398
+ "learning_rate": 6.9005847953216375e-06,
1399
+ "loss": 1.4322,
1400
+ "step": 1120
1401
+ },
1402
+ {
1403
+ "epoch": 6.57,
1404
+ "learning_rate": 6.842105263157896e-06,
1405
+ "loss": 1.4421,
1406
+ "step": 1125
1407
+ },
1408
+ {
1409
+ "epoch": 6.6,
1410
+ "learning_rate": 6.783625730994152e-06,
1411
+ "loss": 1.4346,
1412
+ "step": 1130
1413
+ },
1414
+ {
1415
+ "epoch": 6.63,
1416
+ "learning_rate": 6.72514619883041e-06,
1417
+ "loss": 1.4141,
1418
+ "step": 1135
1419
+ },
1420
+ {
1421
+ "epoch": 6.66,
1422
+ "learning_rate": 6.666666666666667e-06,
1423
+ "loss": 1.4387,
1424
+ "step": 1140
1425
+ },
1426
+ {
1427
+ "epoch": 6.69,
1428
+ "learning_rate": 6.608187134502925e-06,
1429
+ "loss": 1.422,
1430
+ "step": 1145
1431
+ },
1432
+ {
1433
+ "epoch": 6.72,
1434
+ "learning_rate": 6.549707602339181e-06,
1435
+ "loss": 1.4172,
1436
+ "step": 1150
1437
+ },
1438
+ {
1439
+ "epoch": 6.75,
1440
+ "learning_rate": 6.491228070175439e-06,
1441
+ "loss": 1.4387,
1442
+ "step": 1155
1443
+ },
1444
+ {
1445
+ "epoch": 6.78,
1446
+ "learning_rate": 6.432748538011696e-06,
1447
+ "loss": 1.4121,
1448
+ "step": 1160
1449
+ },
1450
+ {
1451
+ "epoch": 6.81,
1452
+ "learning_rate": 6.374269005847954e-06,
1453
+ "loss": 1.4065,
1454
+ "step": 1165
1455
+ },
1456
+ {
1457
+ "epoch": 6.84,
1458
+ "learning_rate": 6.31578947368421e-06,
1459
+ "loss": 1.4487,
1460
+ "step": 1170
1461
+ },
1462
+ {
1463
+ "epoch": 6.87,
1464
+ "learning_rate": 6.2573099415204685e-06,
1465
+ "loss": 1.4362,
1466
+ "step": 1175
1467
+ },
1468
+ {
1469
+ "epoch": 6.9,
1470
+ "learning_rate": 6.198830409356725e-06,
1471
+ "loss": 1.4351,
1472
+ "step": 1180
1473
+ },
1474
+ {
1475
+ "epoch": 6.92,
1476
+ "learning_rate": 6.140350877192983e-06,
1477
+ "loss": 1.4378,
1478
+ "step": 1185
1479
+ },
1480
+ {
1481
+ "epoch": 6.95,
1482
+ "learning_rate": 6.08187134502924e-06,
1483
+ "loss": 1.3944,
1484
+ "step": 1190
1485
+ },
1486
+ {
1487
+ "epoch": 6.98,
1488
+ "learning_rate": 6.023391812865498e-06,
1489
+ "loss": 1.4229,
1490
+ "step": 1195
1491
+ },
1492
+ {
1493
+ "epoch": 6.99,
1494
+ "eval_loss": 1.4238632917404175,
1495
+ "eval_runtime": 54.6138,
1496
+ "eval_samples_per_second": 96.77,
1497
+ "eval_steps_per_second": 3.04,
1498
+ "step": 1197
1499
+ },
1500
+ {
1501
+ "epoch": 7.01,
1502
+ "learning_rate": 5.964912280701755e-06,
1503
+ "loss": 1.4347,
1504
+ "step": 1200
1505
+ },
1506
+ {
1507
+ "epoch": 7.04,
1508
+ "learning_rate": 5.906432748538012e-06,
1509
+ "loss": 1.3752,
1510
+ "step": 1205
1511
+ },
1512
+ {
1513
+ "epoch": 7.07,
1514
+ "learning_rate": 5.847953216374269e-06,
1515
+ "loss": 1.4408,
1516
+ "step": 1210
1517
+ },
1518
+ {
1519
+ "epoch": 7.1,
1520
+ "learning_rate": 5.789473684210527e-06,
1521
+ "loss": 1.3978,
1522
+ "step": 1215
1523
+ },
1524
+ {
1525
+ "epoch": 7.13,
1526
+ "learning_rate": 5.730994152046784e-06,
1527
+ "loss": 1.3911,
1528
+ "step": 1220
1529
+ },
1530
+ {
1531
+ "epoch": 7.16,
1532
+ "learning_rate": 5.672514619883041e-06,
1533
+ "loss": 1.4236,
1534
+ "step": 1225
1535
+ },
1536
+ {
1537
+ "epoch": 7.19,
1538
+ "learning_rate": 5.6140350877192985e-06,
1539
+ "loss": 1.396,
1540
+ "step": 1230
1541
+ },
1542
+ {
1543
+ "epoch": 7.22,
1544
+ "learning_rate": 5.555555555555557e-06,
1545
+ "loss": 1.3968,
1546
+ "step": 1235
1547
+ },
1548
+ {
1549
+ "epoch": 7.25,
1550
+ "learning_rate": 5.497076023391813e-06,
1551
+ "loss": 1.411,
1552
+ "step": 1240
1553
+ },
1554
+ {
1555
+ "epoch": 7.28,
1556
+ "learning_rate": 5.438596491228071e-06,
1557
+ "loss": 1.4243,
1558
+ "step": 1245
1559
+ },
1560
+ {
1561
+ "epoch": 7.3,
1562
+ "learning_rate": 5.380116959064328e-06,
1563
+ "loss": 1.3972,
1564
+ "step": 1250
1565
+ },
1566
+ {
1567
+ "epoch": 7.33,
1568
+ "learning_rate": 5.321637426900586e-06,
1569
+ "loss": 1.4079,
1570
+ "step": 1255
1571
+ },
1572
+ {
1573
+ "epoch": 7.36,
1574
+ "learning_rate": 5.263157894736842e-06,
1575
+ "loss": 1.4157,
1576
+ "step": 1260
1577
+ },
1578
+ {
1579
+ "epoch": 7.39,
1580
+ "learning_rate": 5.2046783625731e-06,
1581
+ "loss": 1.3847,
1582
+ "step": 1265
1583
+ },
1584
+ {
1585
+ "epoch": 7.42,
1586
+ "learning_rate": 5.146198830409357e-06,
1587
+ "loss": 1.4167,
1588
+ "step": 1270
1589
+ },
1590
+ {
1591
+ "epoch": 7.45,
1592
+ "learning_rate": 5.087719298245615e-06,
1593
+ "loss": 1.4167,
1594
+ "step": 1275
1595
+ },
1596
+ {
1597
+ "epoch": 7.48,
1598
+ "learning_rate": 5.029239766081871e-06,
1599
+ "loss": 1.3941,
1600
+ "step": 1280
1601
+ },
1602
+ {
1603
+ "epoch": 7.51,
1604
+ "learning_rate": 4.970760233918129e-06,
1605
+ "loss": 1.4131,
1606
+ "step": 1285
1607
+ },
1608
+ {
1609
+ "epoch": 7.54,
1610
+ "learning_rate": 4.912280701754386e-06,
1611
+ "loss": 1.3771,
1612
+ "step": 1290
1613
+ },
1614
+ {
1615
+ "epoch": 7.57,
1616
+ "learning_rate": 4.853801169590643e-06,
1617
+ "loss": 1.395,
1618
+ "step": 1295
1619
+ },
1620
+ {
1621
+ "epoch": 7.6,
1622
+ "learning_rate": 4.7953216374269005e-06,
1623
+ "loss": 1.4258,
1624
+ "step": 1300
1625
+ },
1626
+ {
1627
+ "epoch": 7.63,
1628
+ "learning_rate": 4.736842105263158e-06,
1629
+ "loss": 1.3951,
1630
+ "step": 1305
1631
+ },
1632
+ {
1633
+ "epoch": 7.66,
1634
+ "learning_rate": 4.678362573099415e-06,
1635
+ "loss": 1.4218,
1636
+ "step": 1310
1637
+ },
1638
+ {
1639
+ "epoch": 7.68,
1640
+ "learning_rate": 4.619883040935672e-06,
1641
+ "loss": 1.4144,
1642
+ "step": 1315
1643
+ },
1644
+ {
1645
+ "epoch": 7.71,
1646
+ "learning_rate": 4.56140350877193e-06,
1647
+ "loss": 1.4083,
1648
+ "step": 1320
1649
+ },
1650
+ {
1651
+ "epoch": 7.74,
1652
+ "learning_rate": 4.502923976608187e-06,
1653
+ "loss": 1.4303,
1654
+ "step": 1325
1655
+ },
1656
+ {
1657
+ "epoch": 7.77,
1658
+ "learning_rate": 4.444444444444444e-06,
1659
+ "loss": 1.3925,
1660
+ "step": 1330
1661
+ },
1662
+ {
1663
+ "epoch": 7.8,
1664
+ "learning_rate": 4.385964912280702e-06,
1665
+ "loss": 1.3988,
1666
+ "step": 1335
1667
+ },
1668
+ {
1669
+ "epoch": 7.83,
1670
+ "learning_rate": 4.3274853801169596e-06,
1671
+ "loss": 1.4123,
1672
+ "step": 1340
1673
+ },
1674
+ {
1675
+ "epoch": 7.86,
1676
+ "learning_rate": 4.269005847953217e-06,
1677
+ "loss": 1.4339,
1678
+ "step": 1345
1679
+ },
1680
+ {
1681
+ "epoch": 7.89,
1682
+ "learning_rate": 4.210526315789474e-06,
1683
+ "loss": 1.4151,
1684
+ "step": 1350
1685
+ },
1686
+ {
1687
+ "epoch": 7.92,
1688
+ "learning_rate": 4.152046783625731e-06,
1689
+ "loss": 1.4058,
1690
+ "step": 1355
1691
+ },
1692
+ {
1693
+ "epoch": 7.95,
1694
+ "learning_rate": 4.093567251461989e-06,
1695
+ "loss": 1.4106,
1696
+ "step": 1360
1697
+ },
1698
+ {
1699
+ "epoch": 7.98,
1700
+ "learning_rate": 4.035087719298246e-06,
1701
+ "loss": 1.4,
1702
+ "step": 1365
1703
+ },
1704
+ {
1705
+ "epoch": 8.0,
1706
+ "eval_loss": 1.421057105064392,
1707
+ "eval_runtime": 54.4659,
1708
+ "eval_samples_per_second": 97.033,
1709
+ "eval_steps_per_second": 3.048,
1710
+ "step": 1369
1711
+ },
1712
+ {
1713
+ "epoch": 8.01,
1714
+ "learning_rate": 3.976608187134503e-06,
1715
+ "loss": 1.444,
1716
+ "step": 1370
1717
+ },
1718
+ {
1719
+ "epoch": 8.04,
1720
+ "learning_rate": 3.9181286549707605e-06,
1721
+ "loss": 1.3986,
1722
+ "step": 1375
1723
+ },
1724
+ {
1725
+ "epoch": 8.06,
1726
+ "learning_rate": 3.859649122807018e-06,
1727
+ "loss": 1.3945,
1728
+ "step": 1380
1729
+ },
1730
+ {
1731
+ "epoch": 8.09,
1732
+ "learning_rate": 3.801169590643275e-06,
1733
+ "loss": 1.4086,
1734
+ "step": 1385
1735
+ },
1736
+ {
1737
+ "epoch": 8.12,
1738
+ "learning_rate": 3.7426900584795324e-06,
1739
+ "loss": 1.4037,
1740
+ "step": 1390
1741
+ },
1742
+ {
1743
+ "epoch": 8.15,
1744
+ "learning_rate": 3.6842105263157896e-06,
1745
+ "loss": 1.4306,
1746
+ "step": 1395
1747
+ },
1748
+ {
1749
+ "epoch": 8.18,
1750
+ "learning_rate": 3.625730994152047e-06,
1751
+ "loss": 1.4049,
1752
+ "step": 1400
1753
+ },
1754
+ {
1755
+ "epoch": 8.21,
1756
+ "learning_rate": 3.567251461988304e-06,
1757
+ "loss": 1.3857,
1758
+ "step": 1405
1759
+ },
1760
+ {
1761
+ "epoch": 8.24,
1762
+ "learning_rate": 3.5087719298245615e-06,
1763
+ "loss": 1.3903,
1764
+ "step": 1410
1765
+ },
1766
+ {
1767
+ "epoch": 8.27,
1768
+ "learning_rate": 3.4502923976608188e-06,
1769
+ "loss": 1.4029,
1770
+ "step": 1415
1771
+ },
1772
+ {
1773
+ "epoch": 8.3,
1774
+ "learning_rate": 3.391812865497076e-06,
1775
+ "loss": 1.3956,
1776
+ "step": 1420
1777
+ },
1778
+ {
1779
+ "epoch": 8.33,
1780
+ "learning_rate": 3.3333333333333333e-06,
1781
+ "loss": 1.3745,
1782
+ "step": 1425
1783
+ },
1784
+ {
1785
+ "epoch": 8.36,
1786
+ "learning_rate": 3.2748538011695906e-06,
1787
+ "loss": 1.394,
1788
+ "step": 1430
1789
+ },
1790
+ {
1791
+ "epoch": 8.39,
1792
+ "learning_rate": 3.216374269005848e-06,
1793
+ "loss": 1.3947,
1794
+ "step": 1435
1795
+ },
1796
+ {
1797
+ "epoch": 8.41,
1798
+ "learning_rate": 3.157894736842105e-06,
1799
+ "loss": 1.396,
1800
+ "step": 1440
1801
+ },
1802
+ {
1803
+ "epoch": 8.44,
1804
+ "learning_rate": 3.0994152046783624e-06,
1805
+ "loss": 1.4008,
1806
+ "step": 1445
1807
+ },
1808
+ {
1809
+ "epoch": 8.47,
1810
+ "learning_rate": 3.04093567251462e-06,
1811
+ "loss": 1.3946,
1812
+ "step": 1450
1813
+ },
1814
+ {
1815
+ "epoch": 8.5,
1816
+ "learning_rate": 2.9824561403508774e-06,
1817
+ "loss": 1.3897,
1818
+ "step": 1455
1819
+ },
1820
+ {
1821
+ "epoch": 8.53,
1822
+ "learning_rate": 2.9239766081871347e-06,
1823
+ "loss": 1.4241,
1824
+ "step": 1460
1825
+ },
1826
+ {
1827
+ "epoch": 8.56,
1828
+ "learning_rate": 2.865497076023392e-06,
1829
+ "loss": 1.3892,
1830
+ "step": 1465
1831
+ },
1832
+ {
1833
+ "epoch": 8.59,
1834
+ "learning_rate": 2.8070175438596493e-06,
1835
+ "loss": 1.3806,
1836
+ "step": 1470
1837
+ },
1838
+ {
1839
+ "epoch": 8.62,
1840
+ "learning_rate": 2.7485380116959066e-06,
1841
+ "loss": 1.3899,
1842
+ "step": 1475
1843
+ },
1844
+ {
1845
+ "epoch": 8.65,
1846
+ "learning_rate": 2.690058479532164e-06,
1847
+ "loss": 1.3768,
1848
+ "step": 1480
1849
+ },
1850
+ {
1851
+ "epoch": 8.68,
1852
+ "learning_rate": 2.631578947368421e-06,
1853
+ "loss": 1.4009,
1854
+ "step": 1485
1855
+ },
1856
+ {
1857
+ "epoch": 8.71,
1858
+ "learning_rate": 2.5730994152046784e-06,
1859
+ "loss": 1.3709,
1860
+ "step": 1490
1861
+ },
1862
+ {
1863
+ "epoch": 8.74,
1864
+ "learning_rate": 2.5146198830409357e-06,
1865
+ "loss": 1.4098,
1866
+ "step": 1495
1867
+ },
1868
+ {
1869
+ "epoch": 8.77,
1870
+ "learning_rate": 2.456140350877193e-06,
1871
+ "loss": 1.407,
1872
+ "step": 1500
1873
+ },
1874
+ {
1875
+ "epoch": 8.79,
1876
+ "learning_rate": 2.3976608187134502e-06,
1877
+ "loss": 1.3903,
1878
+ "step": 1505
1879
+ },
1880
+ {
1881
+ "epoch": 8.82,
1882
+ "learning_rate": 2.3391812865497075e-06,
1883
+ "loss": 1.3892,
1884
+ "step": 1510
1885
+ },
1886
+ {
1887
+ "epoch": 8.85,
1888
+ "learning_rate": 2.280701754385965e-06,
1889
+ "loss": 1.3618,
1890
+ "step": 1515
1891
+ },
1892
+ {
1893
+ "epoch": 8.88,
1894
+ "learning_rate": 2.222222222222222e-06,
1895
+ "loss": 1.4114,
1896
+ "step": 1520
1897
+ },
1898
+ {
1899
+ "epoch": 8.91,
1900
+ "learning_rate": 2.1637426900584798e-06,
1901
+ "loss": 1.3862,
1902
+ "step": 1525
1903
+ },
1904
+ {
1905
+ "epoch": 8.94,
1906
+ "learning_rate": 2.105263157894737e-06,
1907
+ "loss": 1.3997,
1908
+ "step": 1530
1909
+ },
1910
+ {
1911
+ "epoch": 8.97,
1912
+ "learning_rate": 2.0467836257309943e-06,
1913
+ "loss": 1.4109,
1914
+ "step": 1535
1915
+ },
1916
+ {
1917
+ "epoch": 9.0,
1918
+ "learning_rate": 1.9883040935672516e-06,
1919
+ "loss": 1.3865,
1920
+ "step": 1540
1921
+ },
1922
+ {
1923
+ "epoch": 9.0,
1924
+ "eval_loss": 1.4214701652526855,
1925
+ "eval_runtime": 54.5568,
1926
+ "eval_samples_per_second": 96.871,
1927
+ "eval_steps_per_second": 3.043,
1928
+ "step": 1540
1929
+ },
1930
+ {
1931
+ "epoch": 9.03,
1932
+ "learning_rate": 1.929824561403509e-06,
1933
+ "loss": 1.3806,
1934
+ "step": 1545
1935
+ },
1936
+ {
1937
+ "epoch": 9.06,
1938
+ "learning_rate": 1.8713450292397662e-06,
1939
+ "loss": 1.3961,
1940
+ "step": 1550
1941
+ },
1942
+ {
1943
+ "epoch": 9.09,
1944
+ "learning_rate": 1.8128654970760235e-06,
1945
+ "loss": 1.3759,
1946
+ "step": 1555
1947
+ },
1948
+ {
1949
+ "epoch": 9.12,
1950
+ "learning_rate": 1.7543859649122807e-06,
1951
+ "loss": 1.3744,
1952
+ "step": 1560
1953
+ },
1954
+ {
1955
+ "epoch": 9.15,
1956
+ "learning_rate": 1.695906432748538e-06,
1957
+ "loss": 1.401,
1958
+ "step": 1565
1959
+ },
1960
+ {
1961
+ "epoch": 9.17,
1962
+ "learning_rate": 1.6374269005847953e-06,
1963
+ "loss": 1.3915,
1964
+ "step": 1570
1965
+ },
1966
+ {
1967
+ "epoch": 9.2,
1968
+ "learning_rate": 1.5789473684210526e-06,
1969
+ "loss": 1.4001,
1970
+ "step": 1575
1971
+ },
1972
+ {
1973
+ "epoch": 9.23,
1974
+ "learning_rate": 1.52046783625731e-06,
1975
+ "loss": 1.3885,
1976
+ "step": 1580
1977
+ },
1978
+ {
1979
+ "epoch": 9.26,
1980
+ "learning_rate": 1.4619883040935674e-06,
1981
+ "loss": 1.3917,
1982
+ "step": 1585
1983
+ },
1984
+ {
1985
+ "epoch": 9.29,
1986
+ "learning_rate": 1.4035087719298246e-06,
1987
+ "loss": 1.4041,
1988
+ "step": 1590
1989
+ },
1990
+ {
1991
+ "epoch": 9.32,
1992
+ "learning_rate": 1.345029239766082e-06,
1993
+ "loss": 1.397,
1994
+ "step": 1595
1995
+ },
1996
+ {
1997
+ "epoch": 9.35,
1998
+ "learning_rate": 1.2865497076023392e-06,
1999
+ "loss": 1.3874,
2000
+ "step": 1600
2001
+ },
2002
+ {
2003
+ "epoch": 9.38,
2004
+ "learning_rate": 1.2280701754385965e-06,
2005
+ "loss": 1.3742,
2006
+ "step": 1605
2007
+ },
2008
+ {
2009
+ "epoch": 9.41,
2010
+ "learning_rate": 1.1695906432748538e-06,
2011
+ "loss": 1.3782,
2012
+ "step": 1610
2013
+ },
2014
+ {
2015
+ "epoch": 9.44,
2016
+ "learning_rate": 1.111111111111111e-06,
2017
+ "loss": 1.3787,
2018
+ "step": 1615
2019
+ },
2020
+ {
2021
+ "epoch": 9.47,
2022
+ "learning_rate": 1.0526315789473685e-06,
2023
+ "loss": 1.3616,
2024
+ "step": 1620
2025
+ },
2026
+ {
2027
+ "epoch": 9.5,
2028
+ "learning_rate": 9.941520467836258e-07,
2029
+ "loss": 1.3762,
2030
+ "step": 1625
2031
+ },
2032
+ {
2033
+ "epoch": 9.53,
2034
+ "learning_rate": 9.356725146198831e-07,
2035
+ "loss": 1.3793,
2036
+ "step": 1630
2037
+ },
2038
+ {
2039
+ "epoch": 9.55,
2040
+ "learning_rate": 8.771929824561404e-07,
2041
+ "loss": 1.3743,
2042
+ "step": 1635
2043
+ },
2044
+ {
2045
+ "epoch": 9.58,
2046
+ "learning_rate": 8.187134502923977e-07,
2047
+ "loss": 1.411,
2048
+ "step": 1640
2049
+ },
2050
+ {
2051
+ "epoch": 9.61,
2052
+ "learning_rate": 7.60233918128655e-07,
2053
+ "loss": 1.3924,
2054
+ "step": 1645
2055
+ },
2056
+ {
2057
+ "epoch": 9.64,
2058
+ "learning_rate": 7.017543859649123e-07,
2059
+ "loss": 1.3624,
2060
+ "step": 1650
2061
+ },
2062
+ {
2063
+ "epoch": 9.67,
2064
+ "learning_rate": 6.432748538011696e-07,
2065
+ "loss": 1.3848,
2066
+ "step": 1655
2067
+ },
2068
+ {
2069
+ "epoch": 9.7,
2070
+ "learning_rate": 5.847953216374269e-07,
2071
+ "loss": 1.3912,
2072
+ "step": 1660
2073
+ },
2074
+ {
2075
+ "epoch": 9.73,
2076
+ "learning_rate": 5.263157894736843e-07,
2077
+ "loss": 1.4152,
2078
+ "step": 1665
2079
+ },
2080
+ {
2081
+ "epoch": 9.76,
2082
+ "learning_rate": 4.6783625730994155e-07,
2083
+ "loss": 1.3942,
2084
+ "step": 1670
2085
+ },
2086
+ {
2087
+ "epoch": 9.79,
2088
+ "learning_rate": 4.093567251461988e-07,
2089
+ "loss": 1.3855,
2090
+ "step": 1675
2091
+ },
2092
+ {
2093
+ "epoch": 9.82,
2094
+ "learning_rate": 3.5087719298245616e-07,
2095
+ "loss": 1.3804,
2096
+ "step": 1680
2097
+ },
2098
+ {
2099
+ "epoch": 9.85,
2100
+ "learning_rate": 2.9239766081871344e-07,
2101
+ "loss": 1.3904,
2102
+ "step": 1685
2103
+ },
2104
+ {
2105
+ "epoch": 9.88,
2106
+ "learning_rate": 2.3391812865497077e-07,
2107
+ "loss": 1.3886,
2108
+ "step": 1690
2109
+ },
2110
+ {
2111
+ "epoch": 9.91,
2112
+ "learning_rate": 1.7543859649122808e-07,
2113
+ "loss": 1.4217,
2114
+ "step": 1695
2115
+ },
2116
+ {
2117
+ "epoch": 9.93,
2118
+ "learning_rate": 1.1695906432748539e-07,
2119
+ "loss": 1.3938,
2120
+ "step": 1700
2121
+ },
2122
+ {
2123
+ "epoch": 9.96,
2124
+ "learning_rate": 5.847953216374269e-08,
2125
+ "loss": 1.3799,
2126
+ "step": 1705
2127
+ },
2128
+ {
2129
+ "epoch": 9.99,
2130
+ "learning_rate": 0.0,
2131
+ "loss": 1.3871,
2132
+ "step": 1710
2133
+ },
2134
+ {
2135
+ "epoch": 9.99,
2136
+ "eval_loss": 1.4218624830245972,
2137
+ "eval_runtime": 54.6025,
2138
+ "eval_samples_per_second": 96.791,
2139
+ "eval_steps_per_second": 3.04,
2140
+ "step": 1710
2141
+ },
2142
+ {
2143
+ "epoch": 9.99,
2144
+ "step": 1710,
2145
+ "total_flos": 1.862028915941376e+17,
2146
+ "train_loss": 1.5164859166619373,
2147
+ "train_runtime": 422176.3551,
2148
+ "train_samples_per_second": 2.075,
2149
+ "train_steps_per_second": 0.004
2150
+ }
2151
+ ],
2152
+ "max_steps": 1710,
2153
+ "num_train_epochs": 10,
2154
+ "total_flos": 1.862028915941376e+17,
2155
+ "trial_name": null,
2156
+ "trial_params": null
2157
+ }
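trainer_state.json holds the full log_history shown above (a training-loss entry every five steps and an eval_loss entry at the end of each epoch) together with the best checkpoint (eval_loss 1.4211 at step 1369). A sketch for plotting the two loss curves from it is below; it only assumes the structure visible in this diff and that matplotlib is installed.

```python
# Sketch: plot training and evaluation loss from trainer_state.json.
import json

import matplotlib.pyplot as plt

with open("trainer_state.json") as f:
    state = json.load(f)

# Training entries carry "loss"; evaluation entries carry "eval_loss".
train = [(e["step"], e["loss"]) for e in state["log_history"] if "loss" in e]
evals = [(e["step"], e["eval_loss"]) for e in state["log_history"] if "eval_loss" in e]

plt.plot(*zip(*train), label="train loss")
plt.plot(*zip(*evals), marker="o", label="eval loss")
plt.xlabel("step")
plt.ylabel("loss")
plt.legend()
plt.title("bart_base_qgen fine-tuning")
plt.savefig("loss_curve.png", dpi=150)
```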