|
--- |
|
base_model: mistral_rulm_unigram_init_20_10_23 |
|
tags: |
|
- generated_from_trainer |
|
metrics: |
|
- accuracy |
|
model-index: |
|
- name: mistral7b_darulm_unigram_1e_20_10_23 |
|
results: [] |
|
--- |
|
|
|
|
|
|
# mistral7b_darulm_unigram_1e_20_10_23 |
|
|
|
This model is a fine-tuned version of [mistral_rulm_unigram_init_20_10_23](https://huggingface.co/mistral_rulm_unigram_init_20_10_23); the training dataset is not documented in this card.
|
It achieves the following results on the evaluation set: |
|
- Loss: 2.7017 |
|
- Accuracy: 0.4706 |
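
Assuming the reported loss is the mean per-token cross-entropy in nats (the `transformers` `Trainer` default for causal language modeling), this corresponds to a validation perplexity of exp(2.7017) ≈ 14.9.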
|
|
|
## Model description |
|
|
|
No detailed description was provided. Judging by the repository names, this appears to be a Mistral-7B model whose tokenizer was swapped for a Unigram vocabulary (the base checkpoint `mistral_rulm_unigram_init_20_10_23` suggests re-initialized embeddings for a Russian language-modeling setup) and then further trained for one epoch.
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
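
In the absence of documented usage guidance, the following is a minimal inference sketch. It assumes the checkpoint loads with the standard `transformers` causal-LM classes; the repo id below is a hypothetical placeholder for wherever this checkpoint is actually hosted.

```python
# Minimal inference sketch. The repo id is a hypothetical placeholder;
# substitute the real Hub path (or a local directory) for this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/mistral7b_darulm_unigram_1e_20_10_23"  # hypothetical

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # 7B parameters; half precision to fit on one GPU
    device_map="auto",
)

# The base model name hints at a Russian corpus, but any text works here.
inputs = tokenizer("Пример текста:", return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```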
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch reproducing them follows the list):
|
- learning_rate: 0.0003 |
|
- train_batch_size: 6 |
|
- eval_batch_size: 6 |
|
- seed: 42 |
|
- distributed_type: multi-GPU |
|
- num_devices: 10 |
|
- gradient_accumulation_steps: 4 |
|
- total_train_batch_size: 240 |
|
- total_eval_batch_size: 60 |
|
- optimizer: Adam with betas=(0.9,0.95) and epsilon=1e-05 |
|
- lr_scheduler_type: linear |
|
- lr_scheduler_warmup_steps: 200 |
|
- num_epochs: 1.0 |
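
As a reproduction sketch, these settings map directly onto `transformers.TrainingArguments`. The dataset, model paths, and multi-GPU launcher (e.g. `torchrun` across 10 devices) are not recorded in this card, so only the listed hyperparameters appear below:

```python
# Hedged reproduction of the listed hyperparameters via TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mistral7b_darulm_unigram_1e_20_10_23",
    learning_rate=3e-4,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    gradient_accumulation_steps=4,  # 6 per device x 10 GPUs x 4 steps = 240 total
    num_train_epochs=1.0,
    lr_scheduler_type="linear",
    warmup_steps=200,
    adam_beta1=0.9,
    adam_beta2=0.95,
    adam_epsilon=1e-5,
    seed=42,
)
```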
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Accuracy | |
|
|:-------------:|:-----:|:------:|:---------------:|:--------:| |
|
| 3.1631 | 0.01 | 1000 | 3.1804 | 0.4182 | |
|
| 3.1124 | 0.02 | 2000 | 3.1065 | 0.4272 | |
|
| 3.0757 | 0.03 | 3000 | 3.0894 | 0.4288 | |
|
| 3.0488 | 0.04 | 4000 | 3.0630 | 0.4319 | |
|
| 3.0403 | 0.05 | 5000 | 3.0423 | 0.4336 | |
|
| 3.0172 | 0.05 | 6000 | 3.0384 | 0.4343 | |
|
| 3.0102 | 0.06 | 7000 | 3.0267 | 0.4360 | |
|
| 2.9888 | 0.07 | 8000 | 3.0190 | 0.4361 | |
|
| 3.0024 | 0.08 | 9000 | 3.0040 | 0.4385 | |
|
| 2.9948 | 0.09 | 10000 | 3.0058 | 0.4378 | |
|
| 2.9774 | 0.1 | 11000 | 2.9962 | 0.4389 | |
|
| 2.9818 | 0.11 | 12000 | 2.9964 | 0.4390 | |
|
| 2.9771 | 0.12 | 13000 | 2.9913 | 0.4396 | |
|
| 2.9786 | 0.13 | 14000 | 2.9915 | 0.4391 | |
|
| 2.9866 | 0.14 | 15000 | 2.9924 | 0.4394 | |
|
| 2.9751 | 0.14 | 16000 | 2.9918 | 0.4389 | |
|
| 2.9702 | 0.15 | 17000 | 2.9926 | 0.4393 | |
|
| 2.9695 | 0.16 | 18000 | 2.9816 | 0.4401 | |
|
| 2.9615 | 0.17 | 19000 | 2.9826 | 0.4402 | |
|
| 2.9609 | 0.18 | 20000 | 2.9791 | 0.4406 | |
|
| 2.9607 | 0.19 | 21000 | 2.9684 | 0.4416 | |
|
| 2.9533 | 0.2 | 22000 | 2.9677 | 0.4422 | |
|
| 2.9513 | 0.21 | 23000 | 2.9676 | 0.4421 | |
|
| 2.9563 | 0.22 | 24000 | 2.9610 | 0.4429 | |
|
| 2.9466 | 0.23 | 25000 | 2.9627 | 0.4424 | |
|
| 2.9431 | 0.24 | 26000 | 2.9590 | 0.4424 | |
|
| 2.9412 | 0.24 | 27000 | 2.9525 | 0.4436 | |
|
| 2.9299 | 0.25 | 28000 | 2.9504 | 0.4435 | |
|
| 2.9332 | 0.26 | 29000 | 2.9486 | 0.4435 | |
|
| 2.9255 | 0.27 | 30000 | 2.9425 | 0.4442 | |
|
| 2.9242 | 0.28 | 31000 | 2.9459 | 0.4434 | |
|
| 2.9242 | 0.29 | 32000 | 2.9378 | 0.4445 | |
|
| 2.9267 | 0.3 | 33000 | 2.9316 | 0.4453 | |
|
| 2.9151 | 0.31 | 34000 | 2.9315 | 0.4454 | |
|
| 2.9105 | 0.32 | 35000 | 2.9286 | 0.4456 | |
|
| 2.9053 | 0.33 | 36000 | 2.9242 | 0.4457 | |
|
| 2.9023 | 0.33 | 37000 | 2.9195 | 0.4466 | |
|
| 2.8946 | 0.34 | 38000 | 2.9177 | 0.4468 | |
|
| 2.9037 | 0.35 | 39000 | 2.9147 | 0.4470 | |
|
| 2.8893 | 0.36 | 40000 | 2.9130 | 0.4468 | |
|
| 2.8891 | 0.37 | 41000 | 2.9055 | 0.4481 | |
|
| 2.8851 | 0.38 | 42000 | 2.9017 | 0.4485 | |
|
| 2.8909 | 0.39 | 43000 | 2.9011 | 0.4483 | |
|
| 2.896 | 0.4 | 44000 | 2.9061 | 0.4479 | |
|
| 2.8918 | 0.41 | 45000 | 2.9043 | 0.4479 | |
|
| 2.8847 | 0.42 | 46000 | 2.8954 | 0.4490 | |
|
| 2.8749 | 0.42 | 47000 | 2.8912 | 0.4494 | |
|
| 2.8832 | 0.43 | 48000 | 2.8912 | 0.4496 | |
|
| 2.8745 | 0.44 | 49000 | 2.8853 | 0.4500 | |
|
| 2.8717 | 0.45 | 50000 | 2.8834 | 0.4502 | |
|
| 2.8659 | 0.46 | 51000 | 2.8831 | 0.4503 | |
|
| 2.865 | 0.47 | 52000 | 2.8784 | 0.4505 | |
|
| 2.8575 | 0.48 | 53000 | 2.8763 | 0.4508 | |
|
| 2.8571 | 0.49 | 54000 | 2.8741 | 0.4513 | |
|
| 2.8554 | 0.5 | 55000 | 2.8704 | 0.4514 | |
|
| 2.8526 | 0.51 | 56000 | 2.8669 | 0.4519 | |
|
| 2.8521 | 0.52 | 57000 | 2.8618 | 0.4525 | |
|
| 2.8398 | 0.52 | 58000 | 2.8600 | 0.4522 | |
|
| 2.8398 | 0.53 | 59000 | 2.8576 | 0.4528 | |
|
| 2.837 | 0.54 | 60000 | 2.8536 | 0.4528 | |
|
| 2.837 | 0.55 | 61000 | 2.8519 | 0.4535 | |
|
| 2.8427 | 0.56 | 62000 | 2.8493 | 0.4536 | |
|
| 2.8365 | 0.57 | 63000 | 2.8468 | 0.4541 | |
|
| 2.8327 | 0.58 | 64000 | 2.8447 | 0.4539 | |
|
| 2.8289 | 0.59 | 65000 | 2.8388 | 0.4546 | |
|
| 2.8166 | 0.6 | 66000 | 2.8346 | 0.4547 | |
|
| 2.8171 | 0.61 | 67000 | 2.8294 | 0.4558 | |
|
| 2.8184 | 0.61 | 68000 | 2.8269 | 0.4556 | |
|
| 2.8102 | 0.62 | 69000 | 2.8243 | 0.4563 | |
|
| 2.8153 | 0.63 | 70000 | 2.8211 | 0.4564 | |
|
| 2.8035 | 0.64 | 71000 | 2.8185 | 0.4569 | |
|
| 2.8042 | 0.65 | 72000 | 2.8206 | 0.4569 | |
|
| 2.7984 | 0.66 | 73000 | 2.8138 | 0.4574 | |
|
| 2.7883 | 0.67 | 74000 | 2.8112 | 0.4574 | |
|
| 2.7962 | 0.68 | 75000 | 2.8056 | 0.4584 | |
|
| 2.7937 | 0.69 | 76000 | 2.8068 | 0.4582 | |
|
| 2.7853 | 0.7 | 77000 | 2.8011 | 0.4588 | |
|
| 2.7798 | 0.71 | 78000 | 2.7954 | 0.4597 | |
|
| 2.7851 | 0.71 | 79000 | 2.7913 | 0.4598 | |
|
| 2.7831 | 0.72 | 80000 | 2.7897 | 0.4600 | |
|
| 2.7773 | 0.73 | 81000 | 2.7862 | 0.4603 | |
|
| 2.7688 | 0.74 | 82000 | 2.7836 | 0.4609 | |
|
| 2.7658 | 0.75 | 83000 | 2.7798 | 0.4610 | |
|
| 2.7622 | 0.76 | 84000 | 2.7815 | 0.4612 | |
|
| 2.7691 | 0.77 | 85000 | 2.7783 | 0.4612 | |
|
| 2.7579 | 0.78 | 86000 | 2.7712 | 0.4619 | |
|
| 2.7614 | 0.79 | 87000 | 2.7673 | 0.4625 | |
|
| 2.7592 | 0.8 | 88000 | 2.7691 | 0.4623 | |
|
| 2.7551 | 0.8 | 89000 | 2.7607 | 0.4634 | |
|
| 2.7397 | 0.81 | 90000 | 2.7579 | 0.4637 | |
|
| 2.7357 | 0.82 | 91000 | 2.7580 | 0.4636 | |
|
| 2.7452 | 0.83 | 92000 | 2.7517 | 0.4643 | |
|
| 2.7418 | 0.84 | 93000 | 2.7533 | 0.4641 | |
|
| 2.7379 | 0.85 | 94000 | 2.7481 | 0.4647 | |
|
| 2.7308 | 0.86 | 95000 | 2.7460 | 0.4654 | |
|
| 2.727 | 0.87 | 96000 | 2.7408 | 0.4655 | |
|
| 2.7282 | 0.88 | 97000 | 2.7351 | 0.4664 | |
|
| 2.7133 | 0.89 | 98000 | 2.7301 | 0.4669 | |
|
| 2.7136 | 0.9 | 99000 | 2.7251 | 0.4673 | |
|
| 2.7108 | 0.9 | 100000 | 2.7208 | 0.4679 | |
|
| 2.7051 | 0.91 | 101000 | 2.7192 | 0.4681 | |
|
| 2.7013 | 0.92 | 102000 | 2.7151 | 0.4687 | |
|
| 2.6996 | 0.93 | 103000 | 2.7129 | 0.4689 | |
|
| 2.6898 | 0.94 | 104000 | 2.7084 | 0.4694 | |
|
| 2.688 | 0.95 | 105000 | 2.7053 | 0.4697 | |
|
| 2.6855 | 0.96 | 106000 | 2.7018 | 0.4701 | |
|
| 2.6852 | 0.97 | 107000 | 2.6989 | 0.4705 | |
|
| 2.689 | 0.98 | 108000 | 2.6982 | 0.4705 | |
|
| 2.6868 | 0.99 | 109000 | 2.6994 | 0.4707 | |
|
| 2.6901 | 0.99 | 110000 | 2.7006 | 0.4707 | |
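
Over the single epoch, validation loss decreased steadily from 3.1804 (step 1000) to about 2.70, and token-level accuracy climbed from 0.4182 to 0.4707; the curves had not fully plateaued by the end of training.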
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.34.0 |
|
- Pytorch 2.0.1+cu118 |
|
- Datasets 2.14.5 |
|
- Tokenizers 0.14.1 |
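
To approximate this environment, the listed versions can be pinned, e.g. `pip install transformers==4.34.0 datasets==2.14.5 tokenizers==0.14.1` plus `pip install torch==2.0.1 --index-url https://download.pytorch.org/whl/cu118` (the CUDA 11.8 wheel index is inferred from the `+cu118` suffix).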
|
|