# mistral-lora-token-classification
This model is a LoRA fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):
- Loss: 1.1492
- Precision: 0.5966
- Recall: 0.5541
- F1-score: 0.5686
- Accuracy: 0.5541
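
The card does not state how these metrics were computed, but the fact that Recall and Accuracy coincide (0.5541) is consistent with token-level weighted averaging, where weighted-average recall is mathematically equal to plain accuracy. A minimal sketch under that assumption, with purely illustrative labels:

```python
# Hedged sketch: token-classification metrics with scikit-learn.
# Weighted-average recall over all tokens equals plain accuracy,
# matching the Recall == Accuracy pattern in the numbers above.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = [0, 1, 2, 2, 1, 0, 2]  # gold token labels (illustrative)
y_pred = [0, 1, 2, 1, 1, 0, 0]  # predicted token labels (illustrative)

precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted", zero_division=0
)
accuracy = accuracy_score(y_true, y_pred)
print(f"Precision: {precision:.4f}  Recall: {recall:.4f}  "
      f"F1-score: {f1:.4f}  Accuracy: {accuracy:.4f}")
```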
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP
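
As a rough reproduction sketch, these hyperparameters map onto a PEFT `LoraConfig` plus `TrainingArguments` setup as below. The LoRA rank, alpha, dropout, and label count are not reported in this card and are illustrative assumptions; Mistral token-classification support may also require a newer Transformers release than the 4.39.3 pinned under Framework versions.

```python
# Hedged reproduction sketch; values marked "assumption" are not in the card.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForTokenClassification, TrainingArguments

model = AutoModelForTokenClassification.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    num_labels=9,  # assumption: the label set is not documented
)

# Attach LoRA adapters (PEFT); rank, alpha, and dropout are assumptions.
model = get_peft_model(
    model,
    LoraConfig(task_type=TaskType.TOKEN_CLS, r=16, lora_alpha=32, lora_dropout=0.1),
)

# These arguments mirror the list above. Adam betas=(0.9, 0.999) and
# epsilon=1e-08 are the TrainingArguments defaults, and fp16=True enables
# native AMP mixed-precision training. Note that a plain "constant"
# schedule ignores warmup_ratio ("constant_with_warmup" would use it).
args = TrainingArguments(
    output_dir="mistral-lora-token-classification",
    learning_rate=1e-5,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=10,
    seed=42,
    lr_scheduler_type="constant",
    warmup_ratio=0.1,
    num_train_epochs=30,
    fp16=True,
)
# A Trainer would then be built from args, the model, a tokenizer, and the
# (unspecified) train/eval datasets.
```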
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | F1-score | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:--------:|:--------:|
| No log        | 1.0   | 474   | 1.9985          | 0.3814    | 0.2424 | 0.2716   | 0.2424   |
| 3.272         | 2.0   | 948   | 1.7847          | 0.4187    | 0.2897 | 0.3251   | 0.2897   |
| 1.8653        | 3.0   | 1422  | 1.7270          | 0.4383    | 0.3032 | 0.3087   | 0.3032   |
| 1.6688        | 4.0   | 1896  | 1.5884          | 0.4382    | 0.4088 | 0.4190   | 0.4088   |
| 1.5773        | 5.0   | 2370  | 1.5324          | 0.4455    | 0.4291 | 0.4305   | 0.4291   |
| 1.5071        | 6.0   | 2844  | 1.4669          | 0.4717    | 0.4443 | 0.4527   | 0.4443   |
| 1.4485        | 7.0   | 3318  | 1.4577          | 0.4804    | 0.4527 | 0.4607   | 0.4527   |
| 1.3983        | 8.0   | 3792  | 1.4055          | 0.5104    | 0.3953 | 0.4235   | 0.3953   |
| 1.3515        | 9.0   | 4266  | 1.4217          | 0.4997    | 0.4831 | 0.4764   | 0.4831   |
| 1.302         | 10.0  | 4740  | 1.3502          | 0.5357    | 0.4789 | 0.4965   | 0.4789   |
| 1.3114        | 11.0  | 5214  | 1.3226          | 0.5321    | 0.5017 | 0.5143   | 0.5017   |
| 1.2243        | 12.0  | 5688  | 1.3426          | 0.5380    | 0.5034 | 0.5155   | 0.5034   |
| 1.2218        | 13.0  | 6162  | 1.3211          | 0.5436    | 0.4975 | 0.5111   | 0.4975   |
| 1.2021        | 14.0  | 6636  | 1.2606          | 0.5552    | 0.5186 | 0.5329   | 0.5186   |
| 1.196         | 15.0  | 7110  | 1.2437          | 0.5642    | 0.5034 | 0.5258   | 0.5034   |
| 1.1738        | 16.0  | 7584  | 1.2437          | 0.5679    | 0.5363 | 0.5460   | 0.5363   |
| 1.1511        | 17.0  | 8058  | 1.2798          | 0.5699    | 0.5017 | 0.5044   | 0.5017   |
| 1.1515        | 18.0  | 8532  | 1.2597          | 0.5717    | 0.5448 | 0.5411   | 0.5448   |
| 1.1265        | 19.0  | 9006  | 1.2373          | 0.5707    | 0.5355 | 0.5438   | 0.5355   |
| 1.1265        | 20.0  | 9480  | 1.2512          | 0.5880    | 0.5752 | 0.5752   | 0.5752   |
| 1.1253        | 21.0  | 9954  | 1.2344          | 0.5928    | 0.5051 | 0.5269   | 0.5051   |
| 1.0966        | 22.0  | 10428 | 1.2514          | 0.5884    | 0.5051 | 0.5256   | 0.5051   |
| 1.1011        | 23.0  | 10902 | 1.2126          | 0.5869    | 0.5574 | 0.5583   | 0.5574   |
| 1.061         | 24.0  | 11376 | 1.2364          | 0.6044    | 0.5372 | 0.5585   | 0.5372   |
| 1.0744        | 25.0  | 11850 | 1.1627          | 0.6052    | 0.5380 | 0.5576   | 0.5380   |
| 1.0366        | 26.0  | 12324 | 1.1630          | 0.5929    | 0.5667 | 0.5766   | 0.5667   |
| 1.0578        | 27.0  | 12798 | 1.1868          | 0.5858    | 0.5726 | 0.5749   | 0.5726   |
| 1.0552        | 28.0  | 13272 | 1.1689          | 0.6039    | 0.5465 | 0.5364   | 0.5465   |
| 1.0451        | 29.0  | 13746 | 1.1845          | 0.6083    | 0.5473 | 0.5578   | 0.5473   |
| 1.0296        | 30.0  | 14220 | 1.1492          | 0.5966    | 0.5541 | 0.5686   | 0.5541   |
### Framework versions
- PEFT 0.10.0
- Transformers 4.39.3
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
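
To use the published adapter (`adhi29/mistral-lora-token-classification`), a minimal loading sketch follows. The label count is an illustrative assumption, since the label set is not documented in this card, and Mistral token-classification support may require a newer Transformers release than the one pinned above.

```python
# Hedged sketch: loading the LoRA adapter on top of the base model.
# num_labels is an assumption; the card does not document the label set.
import torch
from peft import PeftModel
from transformers import AutoModelForTokenClassification, AutoTokenizer

base = "mistralai/Mistral-7B-v0.1"
adapter = "adhi29/mistral-lora-token-classification"

tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForTokenClassification.from_pretrained(base, num_labels=9)
model = PeftModel.from_pretrained(model, adapter)  # attach the LoRA weights
model.eval()

text = "Example sentence for token classification."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predictions = logits.argmax(dim=-1)  # one predicted label id per token
```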