--- license: apache-2.0 library_name: peft tags: - generated_from_trainer base_model: mistralai/Mistral-7B-v0.1 model-index: - name: org_modelorg_model results: [] --- # org_modelorg_model This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.8643 - F1 Micro: 0.6606 - F1 Macro: 0.6479 - F1 Weighted: 0.6611 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - training_steps: 400 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | F1 Weighted | |:-------------:|:------:|:----:|:---------------:|:--------:|:--------:|:-----------:| | 1.6117 | 0.0154 | 25 | 1.3679 | 0.5541 | 0.4980 | 0.5313 | | 1.2903 | 0.0308 | 50 | 1.2094 | 0.6156 | 0.5779 | 0.6029 | | 1.164 | 0.0462 | 75 | 1.0987 | 0.6206 | 0.5931 | 0.6141 | | 1.1168 | 0.0615 | 100 | 1.1057 | 0.6376 | 0.5883 | 0.6165 | | 1.026 | 0.0769 | 125 | 0.9896 | 0.6314 | 0.6196 | 0.6328 | | 0.9481 | 0.0923 | 150 | 0.9619 | 0.6438 | 0.6173 | 0.6373 | | 0.9797 | 0.1077 | 175 | 0.9549 | 0.6514 | 0.6191 | 0.6411 | | 1.045 | 0.1231 | 200 | 0.9121 | 0.6541 | 0.6403 | 0.6543 | | 0.8954 | 0.1385 | 225 | 0.8991 | 0.6595 | 0.6418 | 0.6576 | | 0.9245 | 0.1538 | 250 | 0.8887 | 0.6588 | 0.6433 | 0.6580 | | 0.8636 | 0.1692 | 275 | 0.8824 | 0.6602 | 0.6458 | 0.6600 | | 0.846 | 0.1846 | 300 | 0.8793 | 0.6672 | 0.6451 | 0.6627 | | 0.8885 | 0.2 | 325 | 0.8820 | 0.6696 | 0.6431 | 0.6624 | | 0.8323 | 0.2154 | 350 | 0.8652 | 0.6618 | 0.6474 | 0.6616 | | 0.9313 | 0.2308 | 375 | 0.8654 | 0.6601 | 0.6477 | 0.6608 | | 0.857 | 0.2462 | 400 | 0.8643 | 0.6606 | 0.6479 | 0.6611 | ### Framework versions - PEFT 0.10.0 - Transformers 4.40.2 - Pytorch 2.3.0+cu118 - Datasets 2.19.0 - Tokenizers 0.19.1