---
library_name: transformers
license: other
license_name: qwen
license_link: https://huggingface.co/Qwen/Qwen2.5-72B-Instruct/blob/main/LICENSE
base_model: Qwen/Qwen2.5-72B
datasets:
- anthracite-org/kalo-opus-instruct-22k-no-refusal
- Nopm/Opus_WritingStruct
- Gryphe/Sonnet3.5-SlimOrcaDedupCleaned
- Gryphe/Sonnet3.5-Charcard-Roleplay
- Gryphe/ChatGPT-4o-Writing-Prompts
- Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned
- Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned
- nothingiisreal/Reddit-Dirty-And-WritingPrompts
- allura-org/Celeste-1.x-data-mixture
tags:
- generated_from_trainer
model-index:
- name: EVA-Qwen2.5-72B-SFFT-v0.0
  results: []
---

# This repo contains a copy of the original model quantized to FP8. Original: [EVA-UNIT-01/EVA-Qwen2.5-72B-v0.0](https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-72B-v0.0)

# EVA Qwen2.5-72B v0.0

<p>
An RP/storywriting specialist model, a full-parameter finetune of Qwen2.5-72B on a mixture of synthetic and natural data.<br>
It uses the Celeste 70B 0.1 data mixture, greatly expanded to improve the versatility, creativity and "flavor" of the resulting model.<br>
</p>

<p>The model is available for inference on <a href=https://featherless.ai/models/EVA-UNIT-01/EVA-Qwen2.5-72B-v0.0>Featherless.AI</a></p>

<p>Note: using a quantized KV cache with Qwen2.5 <b>is not recommended</b> and can lead to degraded output quality. On the other hand, Qwen's KV cache is already light enough that using f16 for it shouldn't be problematic.</p>
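
For example, a minimal serving sketch assuming vLLM (which can load FP8-quantized checkpoints on supported hardware); the model id below is a hypothetical placeholder for this repository, and `kv_cache_dtype="auto"` keeps the KV cache unquantized, per the note above:

```python
# Minimal sketch, assuming vLLM with FP8 support on compatible hardware.
# The model id is a hypothetical placeholder for this FP8 repository.
from vllm import LLM, SamplingParams

llm = LLM(
    model="CalamitousFelicitousness/EVA-Qwen2.5-72B-v0.0-FP8",  # hypothetical id
    tensor_parallel_size=4,  # adjust to your GPU count
    kv_cache_dtype="auto",   # keep the KV cache unquantized, per the note above
)
outputs = llm.generate(["Hello!"], SamplingParams(max_tokens=32))
print(outputs[0].outputs[0].text)
```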
<p>Note #2: due to some unexpected effects of data normalization, artifacting in the form of a randomly appearing sequence of <code>—</code> can sometimes show up in outputs if penalties are too high. To avoid it, ban token number <code>158</code>. Thanks to Cahvay/ALK for discovering this fix!</p>
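
With plain `transformers`, the token can be suppressed via `bad_words_ids` (a minimal sketch; frontends such as SillyTavern expose the same thing through a banned-tokens setting):

```python
# Minimal sketch: suppress token id 158 during generation with transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EVA-UNIT-01/EVA-Qwen2.5-72B-v0.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Once upon a time,", return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=200,
    bad_words_ids=[[158]],  # ban the em-dash artifact token described above
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```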

<p>Prompt format is ChatML.</p><br>
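
For reference, a rendered ChatML prompt looks like this; the sketch below uses the tokenizer's bundled chat template (standard for Qwen2.5-based models):

```python
# Render a ChatML prompt with the tokenizer's built-in chat template.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EVA-UNIT-01/EVA-Qwen2.5-72B-v0.0")
messages = [
    {"role": "system", "content": "You are a creative writing assistant."},
    {"role": "user", "content": "Write the opening line of a mystery novel."},
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
# <|im_start|>system
# You are a creative writing assistant.<|im_end|>
# <|im_start|>user
# Write the opening line of a mystery novel.<|im_end|>
# <|im_start|>assistant
```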

<h3>Recommended sampler values:</h3>
<ul>
<li>Temperature: 1</li>
<li>Typical-P: 0.9</li>
<li>Min-P: 0.05</li>
<li>Top-A: 0.2</li>
<li>Repetition Penalty: 1.03</li>
</ul>
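
As a sketch, these values map onto a completion request for an OpenAI-compatible backend that exposes extended samplers; the exact parameter names are an assumption and vary by backend (Top-A in particular is not part of the standard OpenAI schema):

```python
# Sketch: the recommended samplers as a completion request payload.
# Assumes a local OpenAI-compatible server with extended sampler support.
import requests

payload = {
    "model": "EVA-Qwen2.5-72B-v0.0",
    "prompt": "<|im_start|>user\nHello!<|im_end|>\n<|im_start|>assistant\n",
    "max_tokens": 300,
    "temperature": 1.0,
    "typical_p": 0.9,
    "min_p": 0.05,
    "top_a": 0.2,
    "repetition_penalty": 1.03,
}
resp = requests.post("http://localhost:8000/v1/completions", json=payload)
print(resp.json()["choices"][0]["text"])
```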

<h3>Recommended SillyTavern presets (via CalamitousFelicitousness):</h3>

- [Context](https://huggingface.co/EVA-UNIT-01/EVA-Yi-1.5-9B-32K-V1/blob/main/%5BChatML%5D%20Roleplay-v1.9%20Context.json)
- [Instruct and System Prompt](https://huggingface.co/EVA-UNIT-01/EVA-Yi-1.5-9B-32K-V1/blob/main/%5BChatML%5D%20Roleplay-v1.9%20Instruct.json)

<p>
<h3>Training data:</h3>
<ul>
<li>Celeste 70B 0.1 data mixture minus the Opus Instruct subset. See that model's <a href=https://huggingface.co/nothingiisreal/L3.1-70B-Celeste-V0.1-BF16>card</a> for details.</li>
<li>Kalomaze's Opus_Instruct_25k dataset, filtered for refusals.</li>
<li>A subset (1k rows) of ChatGPT-4o-WritingPrompts by Gryphe</li>
<li>A subset (2k rows) of Sonnet3.5-Charcard-Roleplay by Gryphe</li>
<li>Synthstruct and SynthRP datasets by Epiculous</li>
</ul>
<h3>Training time and hardware:</h3>
<ul><li>12 hours on 8xMI300X</li></ul><br>
</p>
<p>The model was trained by Kearm and Auri.</p>
<h4>Special thanks:</h4><ul>
<li>to Gryphe, Lemmy, Kalomaze, Nopm and Epiculous for the data</li>
<li>to CalamitousFelicitousness for providing free inference for public beta testing</li>
<li>and to Allura-org for support and feedback on EVA models.</li></ul>
<a href=https://github.com/axolotl-ai-cloud/axolotl><img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/></a>
<details><summary>See axolotl config</summary>

axolotl version: `0.4.1`
```yaml
base_model: Qwen/Qwen2.5-72B

load_in_8bit: false
load_in_4bit: false
strict: false

plugins:
- axolotl.integrations.liger.LigerPlugin
liger_rope: true
liger_rms_norm: true
liger_swiglu: false
liger_fused_linear_cross_entropy: false

# plugins:
# - axolotl.integrations.spectrum.SpectrumPlugin

# spectrum_top_fraction: 0.5
# # Optional if using a pre-scanned model as your base_model. Useful if using a model mirror
# spectrum_model_name: Qwen/Qwen2.5-32B

datasets:
- path: datasets/deduped_Synthstruct-Gens_processed_sharegpt_converted_cleaned.jsonl
  type: sharegpt
- path: datasets/opus-instruct-22k-no_refusals-filtered.jsonl
  type: sharegpt
- path: datasets/Celeste_Filtered.jsonl
  type: sharegpt
- path: datasets/Gryphe-S3-5-Charcards-names-2k.jsonl
  type: sharegpt
- path: datasets/deduped_SynthRP-Gens_processed_09-25-2024-ShareGPT_converted_cleaned.jsonl
  type: sharegpt
- path: datasets/deduped_Gryphe-4o-WP-1k.jsonl
  type: sharegpt
- path: datasets/deduped_not_samantha_norefusals.jsonl
  type: sharegpt

chat_template: chatml
shuffle_merged_datasets: true
val_set_size: 0.001
output_dir: ./EVA-Qwen2.5-72B-SFFT-v0.0

sequence_len: 8192
sample_packing: true
eval_sample_packing: false
pad_to_sequence_len: true

# adapter: qlora
# lora_model_dir:
# lora_r: 64
# lora_alpha: 128
# lora_dropout: 0.05
# lora_target_linear: true
# peft_use_dora: true

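# Selective full-parameter finetuning: only the parameters matched below are
# trained; everything else stays frozen. (Assumption: the layer list was
# produced by a Spectrum scan, see the commented-out SpectrumPlugin above;
# the card itself doesn't say.)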
unfrozen_parameters:
- ^lm_head.weight$
- ^model.embed_tokens.weight$
# mlp.down_proj layers
- model.layers.62.mlp.down_proj
- model.layers.64.mlp.down_proj
- model.layers.63.mlp.down_proj
- model.layers.66.mlp.down_proj
- model.layers.65.mlp.down_proj
- model.layers.67.mlp.down_proj
- model.layers.68.mlp.down_proj
- model.layers.31.mlp.down_proj
- model.layers.60.mlp.down_proj
- model.layers.69.mlp.down_proj
- model.layers.61.mlp.down_proj
- model.layers.59.mlp.down_proj
- model.layers.30.mlp.down_proj
- model.layers.70.mlp.down_proj
- model.layers.32.mlp.down_proj
- model.layers.34.mlp.down_proj
- model.layers.33.mlp.down_proj
- model.layers.76.mlp.down_proj
- model.layers.72.mlp.down_proj
- model.layers.71.mlp.down_proj
- model.layers.58.mlp.down_proj
- model.layers.75.mlp.down_proj
- model.layers.29.mlp.down_proj
- model.layers.56.mlp.down_proj
- model.layers.26.mlp.down_proj
- model.layers.35.mlp.down_proj
- model.layers.28.mlp.down_proj
- model.layers.57.mlp.down_proj
- model.layers.77.mlp.down_proj
- model.layers.36.mlp.down_proj
- model.layers.27.mlp.down_proj
- model.layers.25.mlp.down_proj
- model.layers.78.mlp.down_proj
- model.layers.37.mlp.down_proj
- model.layers.73.mlp.down_proj
- model.layers.55.mlp.down_proj
- model.layers.54.mlp.down_proj
- model.layers.74.mlp.down_proj
- model.layers.24.mlp.down_proj
- model.layers.53.mlp.down_proj
# mlp.gate_proj layers
- model.layers.78.mlp.gate_proj
- model.layers.77.mlp.gate_proj
- model.layers.76.mlp.gate_proj
- model.layers.79.mlp.gate_proj
- model.layers.75.mlp.gate_proj
- model.layers.74.mlp.gate_proj
- model.layers.73.mlp.gate_proj
- model.layers.72.mlp.gate_proj
- model.layers.71.mlp.gate_proj
- model.layers.70.mlp.gate_proj
- model.layers.69.mlp.gate_proj
- model.layers.57.mlp.gate_proj
- model.layers.54.mlp.gate_proj
- model.layers.55.mlp.gate_proj
- model.layers.68.mlp.gate_proj
- model.layers.63.mlp.gate_proj
- model.layers.53.mlp.gate_proj
- model.layers.44.mlp.gate_proj
- model.layers.45.mlp.gate_proj
- model.layers.49.mlp.gate_proj
- model.layers.58.mlp.gate_proj
- model.layers.46.mlp.gate_proj
- model.layers.56.mlp.gate_proj
- model.layers.67.mlp.gate_proj
- model.layers.62.mlp.gate_proj
- model.layers.50.mlp.gate_proj
- model.layers.64.mlp.gate_proj
- model.layers.52.mlp.gate_proj
- model.layers.40.mlp.gate_proj
- model.layers.43.mlp.gate_proj
- model.layers.48.mlp.gate_proj
- model.layers.66.mlp.gate_proj
- model.layers.47.mlp.gate_proj
- model.layers.59.mlp.gate_proj
- model.layers.65.mlp.gate_proj
- model.layers.61.mlp.gate_proj
- model.layers.60.mlp.gate_proj
- model.layers.42.mlp.gate_proj
- model.layers.51.mlp.gate_proj
- model.layers.41.mlp.gate_proj
# mlp.up_proj layers
- model.layers.70.mlp.up_proj
- model.layers.69.mlp.up_proj
- model.layers.71.mlp.up_proj
- model.layers.68.mlp.up_proj
- model.layers.72.mlp.up_proj
- model.layers.67.mlp.up_proj
- model.layers.66.mlp.up_proj
- model.layers.73.mlp.up_proj
- model.layers.46.mlp.up_proj
- model.layers.63.mlp.up_proj
- model.layers.75.mlp.up_proj
- model.layers.76.mlp.up_proj
- model.layers.74.mlp.up_proj
- model.layers.45.mlp.up_proj
- model.layers.62.mlp.up_proj
- model.layers.64.mlp.up_proj
- model.layers.65.mlp.up_proj
- model.layers.44.mlp.up_proj
- model.layers.53.mlp.up_proj
- model.layers.47.mlp.up_proj
- model.layers.49.mlp.up_proj
- model.layers.48.mlp.up_proj
- model.layers.57.mlp.up_proj
- model.layers.43.mlp.up_proj
- model.layers.42.mlp.up_proj
- model.layers.56.mlp.up_proj
- model.layers.61.mlp.up_proj
- model.layers.54.mlp.up_proj
- model.layers.40.mlp.up_proj
- model.layers.55.mlp.up_proj
- model.layers.77.mlp.up_proj
- model.layers.60.mlp.up_proj
- model.layers.41.mlp.up_proj
- model.layers.35.mlp.up_proj
- model.layers.37.mlp.up_proj
- model.layers.58.mlp.up_proj
- model.layers.34.mlp.up_proj
- model.layers.38.mlp.up_proj
- model.layers.33.mlp.up_proj
- model.layers.39.mlp.up_proj
# self_attn.k_proj layers
- model.layers.36.self_attn.k_proj
- model.layers.79.self_attn.k_proj
- model.layers.35.self_attn.k_proj
- model.layers.34.self_attn.k_proj
- model.layers.37.self_attn.k_proj
- model.layers.33.self_attn.k_proj
- model.layers.38.self_attn.k_proj
- model.layers.39.self_attn.k_proj
- model.layers.74.self_attn.k_proj
- model.layers.77.self_attn.k_proj
- model.layers.41.self_attn.k_proj
- model.layers.69.self_attn.k_proj
- model.layers.32.self_attn.k_proj
- model.layers.78.self_attn.k_proj
- model.layers.30.self_attn.k_proj
- model.layers.70.self_attn.k_proj
- model.layers.25.self_attn.k_proj
- model.layers.42.self_attn.k_proj
- model.layers.29.self_attn.k_proj
- model.layers.31.self_attn.k_proj
- model.layers.68.self_attn.k_proj
- model.layers.66.self_attn.k_proj
- model.layers.22.self_attn.k_proj
- model.layers.65.self_attn.k_proj
- model.layers.44.self_attn.k_proj
- model.layers.40.self_attn.k_proj
- model.layers.63.self_attn.k_proj
- model.layers.23.self_attn.k_proj
- model.layers.28.self_attn.k_proj
- model.layers.24.self_attn.k_proj
- model.layers.26.self_attn.k_proj
- model.layers.67.self_attn.k_proj
- model.layers.75.self_attn.k_proj
- model.layers.27.self_attn.k_proj
- model.layers.57.self_attn.k_proj
- model.layers.64.self_attn.k_proj
- model.layers.71.self_attn.k_proj
- model.layers.61.self_attn.k_proj
- model.layers.72.self_attn.k_proj
- model.layers.73.self_attn.k_proj
# self_attn.o_proj layers
- model.layers.69.self_attn.o_proj
- model.layers.39.self_attn.o_proj
- model.layers.16.self_attn.o_proj
- model.layers.14.self_attn.o_proj
- model.layers.19.self_attn.o_proj
- model.layers.42.self_attn.o_proj
- model.layers.12.self_attn.o_proj
- model.layers.15.self_attn.o_proj
- model.layers.17.self_attn.o_proj
- model.layers.38.self_attn.o_proj
- model.layers.23.self_attn.o_proj
- model.layers.22.self_attn.o_proj
- model.layers.13.self_attn.o_proj
- model.layers.29.self_attn.o_proj
- model.layers.41.self_attn.o_proj
- model.layers.44.self_attn.o_proj
- model.layers.46.self_attn.o_proj
- model.layers.45.self_attn.o_proj
- model.layers.43.self_attn.o_proj
- model.layers.49.self_attn.o_proj
- model.layers.30.self_attn.o_proj
- model.layers.26.self_attn.o_proj
- model.layers.25.self_attn.o_proj
- model.layers.37.self_attn.o_proj
- model.layers.47.self_attn.o_proj
- model.layers.11.self_attn.o_proj
- model.layers.18.self_attn.o_proj
- model.layers.28.self_attn.o_proj
- model.layers.20.self_attn.o_proj
- model.layers.27.self_attn.o_proj
- model.layers.53.self_attn.o_proj
- model.layers.52.self_attn.o_proj
- model.layers.35.self_attn.o_proj
- model.layers.71.self_attn.o_proj
- model.layers.10.self_attn.o_proj
- model.layers.3.self_attn.o_proj
- model.layers.21.self_attn.o_proj
- model.layers.24.self_attn.o_proj
- model.layers.68.self_attn.o_proj
- model.layers.48.self_attn.o_proj
# self_attn.q_proj layers
- model.layers.1.self_attn.q_proj
- model.layers.2.self_attn.q_proj
- model.layers.3.self_attn.q_proj
- model.layers.0.self_attn.q_proj
- model.layers.5.self_attn.q_proj
- model.layers.4.self_attn.q_proj
- model.layers.6.self_attn.q_proj
- model.layers.8.self_attn.q_proj
- model.layers.7.self_attn.q_proj
- model.layers.9.self_attn.q_proj
- model.layers.10.self_attn.q_proj
- model.layers.68.self_attn.q_proj
- model.layers.25.self_attn.q_proj
- model.layers.12.self_attn.q_proj
- model.layers.54.self_attn.q_proj
- model.layers.55.self_attn.q_proj
- model.layers.61.self_attn.q_proj
- model.layers.18.self_attn.q_proj
- model.layers.49.self_attn.q_proj
- model.layers.66.self_attn.q_proj
- model.layers.72.self_attn.q_proj
- model.layers.11.self_attn.q_proj
- model.layers.52.self_attn.q_proj
- model.layers.64.self_attn.q_proj
- model.layers.15.self_attn.q_proj
- model.layers.60.self_attn.q_proj
- model.layers.50.self_attn.q_proj
- model.layers.59.self_attn.q_proj
- model.layers.53.self_attn.q_proj
- model.layers.48.self_attn.q_proj
- model.layers.57.self_attn.q_proj
- model.layers.70.self_attn.q_proj
- model.layers.17.self_attn.q_proj
- model.layers.67.self_attn.q_proj
- model.layers.71.self_attn.q_proj
- model.layers.62.self_attn.q_proj
- model.layers.51.self_attn.q_proj
- model.layers.19.self_attn.q_proj
- model.layers.58.self_attn.q_proj
- model.layers.13.self_attn.q_proj
# self_attn.v_proj layers
- model.layers.23.self_attn.v_proj
- model.layers.25.self_attn.v_proj
- model.layers.26.self_attn.v_proj
- model.layers.27.self_attn.v_proj
- model.layers.28.self_attn.v_proj
- model.layers.29.self_attn.v_proj
- model.layers.30.self_attn.v_proj
- model.layers.31.self_attn.v_proj
- model.layers.34.self_attn.v_proj
- model.layers.35.self_attn.v_proj
- model.layers.36.self_attn.v_proj
- model.layers.37.self_attn.v_proj
- model.layers.38.self_attn.v_proj
- model.layers.42.self_attn.v_proj
- model.layers.48.self_attn.v_proj
- model.layers.57.self_attn.v_proj
- model.layers.58.self_attn.v_proj
- model.layers.61.self_attn.v_proj
- model.layers.63.self_attn.v_proj
- model.layers.64.self_attn.v_proj
- model.layers.65.self_attn.v_proj
- model.layers.66.self_attn.v_proj
- model.layers.69.self_attn.v_proj
- model.layers.70.self_attn.v_proj
- model.layers.74.self_attn.v_proj
- model.layers.75.self_attn.v_proj
- model.layers.72.self_attn.v_proj
- model.layers.39.self_attn.v_proj
- model.layers.41.self_attn.v_proj
- model.layers.40.self_attn.v_proj
- model.layers.33.self_attn.v_proj
- model.layers.59.self_attn.v_proj
- model.layers.16.self_attn.v_proj
- model.layers.15.self_attn.v_proj
- model.layers.76.self_attn.v_proj
- model.layers.24.self_attn.v_proj
- model.layers.68.self_attn.v_proj
- model.layers.67.self_attn.v_proj
- model.layers.55.self_attn.v_proj
- model.layers.44.self_attn.v_proj


wandb_project: EVA-Qwen2.5-72B-SFFT-v0.0
wandb_entity:
wandb_watch:
wandb_name: Unit-00
wandb_log_model:

gradient_accumulation_steps: 4
micro_batch_size: 4
num_epochs: 3
optimizer: paged_adamw_8bit
lr_scheduler: cosine
learning_rate: 0.00005
max_grad_norm: 3

train_on_inputs: false
group_by_length: false
bf16: auto
fp16:
tf32: false

gradient_checkpointing: "unsloth"
# gradient_checkpointing_kwargs:
#   use_reentrant: true
early_stopping_patience:
resume_from_checkpoint:
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true

warmup_steps: 20
evals_per_epoch: 4
saves_per_epoch: 2
save_total_limit: 1
save_safetensors: true
hub_model_id:
hub_strategy:
debug:
deepspeed: deepspeed_configs/zero3_bf16.json
weight_decay: 0.1
# fsdp:
#   - full_shard
#   - auto_wrap
# fsdp_config:
#   fsdp_limit_all_gathers: true
#   fsdp_sync_module_states: false
#   fsdp_offload_params: true
#   fsdp_cpu_ram_efficient_loading: true
#   fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
#   fsdp_transformer_layer_cls_to_wrap: Qwen2DecoderLayer
#   fsdp_activation_checkpointing: true
#   fsdp_state_dict_type: SHARDED_STATE_DICT # Changed from FULL_STATE_DICT
#   fsdp_sharding_strategy: FULL_SHARD
#   fsdp_forward_prefetch: false # Added
#   fsdp_backward_prefetch: "BACKWARD_PRE" # Added
#   fsdp_backward_prefetch_limit: 1 # Added
#   fsdp_mixed_precision: BF16 # Added
```
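
With axolotl `0.4.1` installed, a config like this would typically be launched through the standard entrypoint, e.g. `accelerate launch -m axolotl.cli.train config.yaml` (filename hypothetical), with the referenced DeepSpeed config present at `deepspeed_configs/zero3_bf16.json`.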

</details><br>