---
license: other
base_model: yahma/llama-7b-hf
tags:
- generated_from_trainer
model-index:
- name: V0305P6
  results: []
---

# V0305P6

This model is a fine-tuned version of [yahma/llama-7b-hf](https://huggingface.co/yahma/llama-7b-hf) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0736

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of a matching `TrainingArguments` configuration is given at the end of this card):
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 20
- num_epochs: 3
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.7001        | 0.09  | 10   | 0.5244          |
| 0.2134        | 0.17  | 20   | 0.1572          |
| 0.1574        | 0.26  | 30   | 0.1549          |
| 0.1522        | 0.34  | 40   | 0.1488          |
| 0.1501        | 0.43  | 50   | 0.1488          |
| 0.1553        | 0.51  | 60   | 0.1484          |
| 0.1482        | 0.6   | 70   | 0.1376          |
| 0.144         | 0.68  | 80   | 0.1298          |
| 0.131         | 0.77  | 90   | 0.1147          |
| 0.1268        | 0.85  | 100  | 0.1112          |
| 0.1196        | 0.94  | 110  | 0.0988          |
| 0.115         | 1.02  | 120  | 0.1008          |
| 0.1083        | 1.11  | 130  | 0.0982          |
| 0.102         | 1.19  | 140  | 0.0943          |
| 0.0984        | 1.28  | 150  | 0.0875          |
| 0.0964        | 1.37  | 160  | 0.0853          |
| 0.0953        | 1.45  | 170  | 0.0855          |
| 0.0888        | 1.54  | 180  | 0.0825          |
| 0.089         | 1.62  | 190  | 0.0839          |
| 0.0955        | 1.71  | 200  | 0.0811          |
| 0.094         | 1.79  | 210  | 0.0784          |
| 0.0901        | 1.88  | 220  | 0.0729          |
| 0.0856        | 1.96  | 230  | 0.0771          |
| 0.0717        | 2.05  | 240  | 0.0744          |
| 0.0648        | 2.13  | 250  | 0.0730          |
| 0.061         | 2.22  | 260  | 0.0720          |
| 0.0589        | 2.3   | 270  | 0.0759          |
| 0.0664        | 2.39  | 280  | 0.0702          |
| 0.0676        | 2.47  | 290  | 0.0693          |
| 0.0636        | 2.56  | 300  | 0.0699          |
| 0.0667        | 2.65  | 310  | 0.0711          |
| 0.0585        | 2.73  | 320  | 0.0726          |
| 0.0619        | 2.82  | 330  | 0.0732          |
| 0.0613        | 2.9   | 340  | 0.0735          |
| 0.0611        | 2.99  | 350  | 0.0736          |

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.14.6
- Tokenizers 0.14.1
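
### Training configuration (sketch)

The card does not include the training script. The snippet below is a minimal sketch of a `TrainingArguments` object that mirrors the hyperparameters listed above, assuming the standard `transformers.Trainer` workflow implied by the `generated_from_trainer` tag; `output_dir`, the dataset, and the Trainer wiring are placeholders, and `fp16=True` is an assumption for the "Native AMP" entry.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters reported above; everything else is a placeholder.
training_args = TrainingArguments(
    output_dir="V0305P6",                      # placeholder output directory (assumption)
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=32,            # 4 x 32 = 128 effective train batch size
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=20,
    num_train_epochs=3,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                                 # "Native AMP"; bf16 is also possible (assumption)
)
```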
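
### How to load (sketch)

A minimal inference sketch, assuming the repository publishes full model weights under a placeholder repo id; if only adapter weights were saved, load the base model and attach the adapter with `peft` instead.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/V0305P6"  # placeholder repo id (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the 7B model fits on a single GPU
    device_map="auto",
)

prompt = "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```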