# Qwen2-7B_metamath_reverse

This model is a fine-tuned version of [unsloth/Qwen2-7B](https://huggingface.co/unsloth/Qwen2-7B) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2136
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 1
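
A minimal sketch of how these hyperparameters might be expressed with Hugging Face Transformers and PEFT is shown below. The LoRA rank, alpha, dropout, and target modules are assumptions (they are not documented in this card), and the output directory is illustrative only.

```python
# Hedged sketch: the listed hyperparameters mapped onto TrainingArguments + LoraConfig.
from transformers import TrainingArguments
from peft import LoraConfig

training_args = TrainingArguments(
    output_dir="Qwen2-7B_metamath_reverse",  # illustrative
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,   # effective train batch size: 8 * 8 = 64
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# Assumed LoRA settings; rank, alpha, dropout, and target modules are placeholders,
# not values reported by this card.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
```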
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
0.1753 | 0.0211 | 13 | 0.1875 |
0.2036 | 0.0421 | 26 | 0.2524 |
0.2585 | 0.0632 | 39 | 0.2876 |
0.2848 | 0.0842 | 52 | 0.3146 |
0.2997 | 0.1053 | 65 | 0.3231 |
0.3196 | 0.1264 | 78 | 0.3350 |
0.3263 | 0.1474 | 91 | 0.3406 |
0.3148 | 0.1685 | 104 | 0.3401 |
0.3297 | 0.1896 | 117 | 0.3456 |
0.3221 | 0.2106 | 130 | 0.3477 |
0.3359 | 0.2317 | 143 | 0.3491 |
0.3296 | 0.2527 | 156 | 0.3399 |
0.3361 | 0.2738 | 169 | 0.3416 |
0.3187 | 0.2949 | 182 | 0.3376 |
0.3285 | 0.3159 | 195 | 0.3370 |
0.3189 | 0.3370 | 208 | 0.3306 |
0.3154 | 0.3580 | 221 | 0.3293 |
0.3149 | 0.3791 | 234 | 0.3263 |
0.3099 | 0.4002 | 247 | 0.3208 |
0.3089 | 0.4212 | 260 | 0.3143 |
0.3125 | 0.4423 | 273 | 0.3104 |
0.2959 | 0.4633 | 286 | 0.3061 |
0.3042 | 0.4844 | 299 | 0.2993 |
0.2829 | 0.5055 | 312 | 0.2940 |
0.2832 | 0.5265 | 325 | 0.2878 |
0.2715 | 0.5476 | 338 | 0.2821 |
0.2702 | 0.5687 | 351 | 0.2753 |
0.2687 | 0.5897 | 364 | 0.2687 |
0.2604 | 0.6108 | 377 | 0.2629 |
0.252 | 0.6318 | 390 | 0.2579 |
0.2537 | 0.6529 | 403 | 0.2529 |
0.2535 | 0.6740 | 416 | 0.2477 |
0.2442 | 0.6950 | 429 | 0.2425 |
0.2451 | 0.7161 | 442 | 0.2378 |
0.2275 | 0.7371 | 455 | 0.2338 |
0.2288 | 0.7582 | 468 | 0.2310 |
0.2323 | 0.7793 | 481 | 0.2294 |
0.2254 | 0.8003 | 494 | 0.2260 |
0.2142 | 0.8214 | 507 | 0.2221 |
0.219 | 0.8424 | 520 | 0.2195 |
0.2133 | 0.8635 | 533 | 0.2180 |
0.2095 | 0.8846 | 546 | 0.2164 |
0.2067 | 0.9056 | 559 | 0.2155 |
0.2073 | 0.9267 | 572 | 0.2146 |
0.2124 | 0.9478 | 585 | 0.2140 |
0.2115 | 0.9688 | 598 | 0.2138 |
0.2127 | 0.9899 | 611 | 0.2136 |
### Framework versions
- PEFT 0.12.0
- Transformers 4.44.0
- Pytorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
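
A minimal usage sketch, assuming this repository hosts a PEFT (LoRA) adapter for unsloth/Qwen2-7B; the prompt format below is illustrative and is not specified by this card.

```python
# Hedged sketch: load the base model, attach this adapter with PEFT, and generate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "unsloth/Qwen2-7B"
adapter_id = "imdatta0/Qwen2-7B_metamath_reverse"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)

# Illustrative prompt; the training prompt format is not documented here.
prompt = "Question: What is 15% of 80?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```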