---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
datasets:
- medmnist-v2
metrics:
- accuracy
- precision
- recall
- f1
base_model: google/vit-base-patch16-224-in21k
model-index:
- name: organc-vit-base-finetuned
  results: []
---

# organc-vit-base-finetuned

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the medmnist-v2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2248
- Accuracy: 0.9283
- Precision: 0.9231
- Recall: 0.9160
- F1: 0.9189

## Model description

This repository holds a PEFT adapter for [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k), fine-tuned for medical image classification on MedMNIST v2; the model name suggests the OrganC (coronal abdominal CT organ) subset. As is typical for PEFT repositories, only the adapter weights are stored here, and the base model must be loaded separately (see the usage sketch at the end of this card).

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP

A `TrainingArguments` sketch matching these values appears at the end of this card.

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.6907        | 1.0   | 203  | 0.2221          | 0.9202   | 0.9165    | 0.8691 | 0.8480 |
| 0.5616        | 2.0   | 406  | 0.1278          | 0.9720   | 0.9657    | 0.9694 | 0.9666 |
| 0.5515        | 3.0   | 609  | 0.1428          | 0.9649   | 0.9626    | 0.9640 | 0.9621 |
| 0.4941        | 4.0   | 813  | 0.1016          | 0.9724   | 0.9683    | 0.9696 | 0.9683 |
| 0.4764        | 5.0   | 1016 | 0.0998          | 0.9716   | 0.9654    | 0.9649 | 0.9637 |
| 0.4599        | 6.0   | 1219 | 0.0941          | 0.9758   | 0.9775    | 0.9788 | 0.9778 |
| 0.4525        | 7.0   | 1422 | 0.0861          | 0.9795   | 0.9812    | 0.9793 | 0.9800 |
| 0.3835        | 8.0   | 1626 | 0.0788          | 0.9849   | 0.9846    | 0.9850 | 0.9847 |
| 0.2767        | 9.0   | 1829 | 0.0935          | 0.9774   | 0.9805    | 0.9800 | 0.9798 |
| 0.299         | 9.99  | 2030 | 0.0701          | 0.9854   | 0.9843    | 0.9864 | 0.9852 |

### Framework versions

- PEFT 0.10.0
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
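
## How to use

The adapter is loaded on top of the base model with the `peft` library. The sketch below is a minimal example, not the author's code: the adapter id `organc-vit-base-finetuned` is a hypothetical hub/local path taken from the model name, and the 11-class head is assumed from OrganCMNIST's 11 organ classes; neither is confirmed by this card.

```python
import torch
from PIL import Image
from peft import PeftModel
from transformers import AutoImageProcessor, AutoModelForImageClassification

BASE_ID = "google/vit-base-patch16-224-in21k"
ADAPTER_ID = "organc-vit-base-finetuned"  # hypothetical: replace with the actual hub id or local path

processor = AutoImageProcessor.from_pretrained(BASE_ID)
# num_labels must match the classification head the adapter was trained with;
# 11 is assumed here because OrganCMNIST has 11 organ classes.
base_model = AutoModelForImageClassification.from_pretrained(BASE_ID, num_labels=11)
model = PeftModel.from_pretrained(base_model, ADAPTER_ID)
model.eval()

image = Image.open("ct_slice.png").convert("RGB")  # any input image; the processor resizes it
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print("predicted class id:", logits.argmax(-1).item())
```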
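
## Reproducing the training setup

The hyperparameters listed above map directly onto `transformers.TrainingArguments`. This is a sketch under stated assumptions, not the author's actual training script; `output_dir` and the evaluation/logging strategies are guesses from the per-epoch results table.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="organc-vit-base-finetuned",  # assumed output path
    learning_rate=5e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # 16 * 4 = total train batch size of 64
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="epoch",     # assumed: the results table logs once per epoch
    logging_strategy="epoch",
)
# The listed betas=(0.9, 0.999) and epsilon=1e-08 match the Trainer's default
# AdamW settings, so no extra optimizer configuration is needed.
```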
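
The four reported metrics can be produced with a standard `compute_metrics` callback passed to the `Trainer`. The card does not say how the multi-class precision/recall/F1 were averaged, so the macro averaging below (via scikit-learn) is an assumption:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Macro averaging is an assumption; the card does not state the aggregation.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```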