Meta-Llama-3-8B_pct_reverse
This model is a fine-tuned version of unsloth/llama-3-8b on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.1917
Model description
More information needed
Intended uses & limitations
More information needed
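Since usage is not documented, here is a minimal loading sketch. It assumes this repository hosts a PEFT (LoRA-style) adapter on top of unsloth/llama-3-8b, as the framework versions and model tree below suggest; the prompt and generation settings are illustrative only.

```python
# Minimal sketch: load the base model, then attach this repo's PEFT adapter.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "unsloth/llama-3-8b", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("unsloth/llama-3-8b")
model = PeftModel.from_pretrained(base, "imdatta0/Meta-Llama-3-8B_pct_reverse")

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```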
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a TrainingArguments sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 1
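For reference, a hedged sketch of how these settings map onto transformers.TrainingArguments (argument names per Transformers 4.44.0; output_dir is hypothetical, and the Adam betas/epsilon above match the library defaults):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="Meta-Llama-3-8B_pct_reverse",  # hypothetical output path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,  # 8 devices-steps x 8 per step = effective batch of 64
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=1,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults:
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8
)
```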
Training results
| Training Loss | Epoch | Step | Validation Loss |
| --- | --- | --- | --- |
| 2.2547 | 0.0206 | 8 | 2.2652 |
| 2.2857 | 0.0412 | 16 | 2.2722 |
| 2.217 | 0.0618 | 24 | 2.2663 |
| 2.2942 | 0.0824 | 32 | 2.2549 |
| 2.281 | 0.1030 | 40 | 2.2508 |
| 2.2541 | 0.1236 | 48 | 2.2708 |
| 2.2672 | 0.1442 | 56 | 2.2648 |
| 2.2887 | 0.1648 | 64 | 2.2698 |
| 2.2464 | 0.1854 | 72 | 2.2654 |
| 2.2805 | 0.2060 | 80 | 2.2734 |
| 2.3111 | 0.2266 | 88 | 2.2742 |
| 2.361 | 0.2472 | 96 | 2.2808 |
| 2.3418 | 0.2678 | 104 | 2.2802 |
| 2.3064 | 0.2884 | 112 | 2.2952 |
| 2.3509 | 0.3090 | 120 | 2.2841 |
| 2.3507 | 0.3296 | 128 | 2.2786 |
| 2.3 | 0.3502 | 136 | 2.2801 |
| 2.2953 | 0.3708 | 144 | 2.2772 |
| 2.3224 | 0.3914 | 152 | 2.2823 |
| 2.3055 | 0.4120 | 160 | 2.2739 |
| 2.3519 | 0.4326 | 168 | 2.2795 |
| 2.2988 | 0.4532 | 176 | 2.2694 |
| 2.3046 | 0.4738 | 184 | 2.2648 |
| 2.296 | 0.4944 | 192 | 2.2661 |
| 2.2908 | 0.5150 | 200 | 2.2650 |
| 2.2923 | 0.5356 | 208 | 2.2633 |
| 2.3062 | 0.5562 | 216 | 2.2469 |
| 2.289 | 0.5768 | 224 | 2.2516 |
| 2.2736 | 0.5974 | 232 | 2.2452 |
| 2.2414 | 0.6180 | 240 | 2.2406 |
| 2.2667 | 0.6386 | 248 | 2.2355 |
| 2.2595 | 0.6592 | 256 | 2.2354 |
| 2.2175 | 0.6798 | 264 | 2.2276 |
| 2.277 | 0.7004 | 272 | 2.2221 |
| 2.2576 | 0.7210 | 280 | 2.2161 |
| 2.2604 | 0.7416 | 288 | 2.2123 |
| 2.2526 | 0.7621 | 296 | 2.2118 |
| 2.2838 | 0.7827 | 304 | 2.2033 |
| 2.2214 | 0.8033 | 312 | 2.2009 |
| 2.2034 | 0.8239 | 320 | 2.2015 |
| 2.235 | 0.8445 | 328 | 2.1954 |
| 2.2444 | 0.8651 | 336 | 2.1971 |
| 2.2593 | 0.8857 | 344 | 2.1939 |
| 2.2222 | 0.9063 | 352 | 2.1929 |
| 2.1894 | 0.9269 | 360 | 2.1944 |
| 2.2138 | 0.9475 | 368 | 2.1927 |
| 2.2543 | 0.9681 | 376 | 2.1918 |
| 2.2462 | 0.9887 | 384 | 2.1917 |
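Validation loss rose slightly over the first third of the epoch (peaking at 2.2952 at step 112) before declining steadily; the final value of 2.1917 at step 384 matches the evaluation loss reported at the top of this card.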
Framework versions
- PEFT 0.12.0
- Transformers 4.44.0
- PyTorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
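To sanity-check a local environment against these pins, a small sketch using importlib.metadata (package names are the standard PyPI distributions; torch is the distribution name for the PyTorch entry above):

```python
# Compare installed package versions against those listed in this card.
from importlib.metadata import version, PackageNotFoundError

pins = {
    "peft": "0.12.0",
    "transformers": "4.44.0",
    "torch": "2.4.0+cu121",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
for pkg, want in pins.items():
    try:
        have = version(pkg)
    except PackageNotFoundError:
        have = "not installed"
    print(f"{pkg}: installed={have}, card lists {want}")
```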
Model tree for imdatta0/Meta-Llama-3-8B_pct_reverse
- Base model: unsloth/llama-3-8b