---
library_name: transformers
license: mit
base_model: dbmdz/bert-base-turkish-cased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: turkish-zeroshot
  results: []
---

# turkish-zeroshot

This model is a fine-tuned version of [dbmdz/bert-base-turkish-cased](https://huggingface.co/dbmdz/bert-base-turkish-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6808
- Accuracy: 0.7671
- F1: 0.7677
- Precision: 0.7764
- Recall: 0.7671

## Model description

More information needed

## Intended uses & limitations

More information needed. A hedged usage sketch for zero-shot classification appears at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08 and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 5
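For illustration only, below is a minimal sketch of how the hyperparameters above might be expressed with `TrainingArguments` from the `transformers` Trainer API. The output directory and the evaluation cadence are assumptions (the results table logs evaluation every 200 steps); dataset loading, tokenization, and the `Trainer` call are omitted because the training data is not documented on this card.

```python
from transformers import TrainingArguments

# Hedged sketch: mirrors the hyperparameters listed above.
# output_dir, eval_strategy and eval_steps are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="turkish-zeroshot",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=500,
    num_train_epochs=5,
    eval_strategy="steps",  # the table below reports validation metrics every 200 steps
    eval_steps=200,
    logging_steps=200,
)
```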
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.09 | 0.0326 | 200 | 1.0950 | 0.3759 | 0.3534 | 0.3966 | 0.3759 |
| 0.9377 | 0.0652 | 400 | 0.8817 | 0.6092 | 0.6059 | 0.6499 | 0.6092 |
| 0.8277 | 0.0978 | 600 | 0.7518 | 0.6799 | 0.6801 | 0.6904 | 0.6799 |
| 0.7771 | 0.1304 | 800 | 0.7274 | 0.6984 | 0.6991 | 0.7138 | 0.6984 |
| 0.7698 | 0.1630 | 1000 | 0.6928 | 0.7 | 0.7015 | 0.7111 | 0.7 |
| 0.7619 | 0.1956 | 1200 | 0.6820 | 0.7161 | 0.7166 | 0.7313 | 0.7161 |
| 0.7453 | 0.2282 | 1400 | 0.6614 | 0.7205 | 0.7217 | 0.7307 | 0.7205 |
| 0.7287 | 0.2608 | 1600 | 0.6589 | 0.7209 | 0.7204 | 0.7346 | 0.7209 |
| 0.7168 | 0.2934 | 1800 | 0.6694 | 0.7157 | 0.7157 | 0.7311 | 0.7157 |
| 0.6923 | 0.3259 | 2000 | 0.6655 | 0.7165 | 0.7167 | 0.7312 | 0.7165 |
| 0.7348 | 0.3585 | 2200 | 0.6594 | 0.7221 | 0.7207 | 0.7366 | 0.7221 |
| 0.7022 | 0.3911 | 2400 | 0.6757 | 0.7317 | 0.7309 | 0.7536 | 0.7317 |
| 0.6968 | 0.4237 | 2600 | 0.6448 | 0.7297 | 0.7305 | 0.7484 | 0.7297 |
| 0.7011 | 0.4563 | 2800 | 0.6169 | 0.7398 | 0.7403 | 0.7458 | 0.7398 |
| 0.6949 | 0.4889 | 3000 | 0.6200 | 0.7482 | 0.7483 | 0.7530 | 0.7482 |
| 0.7042 | 0.5215 | 3200 | 0.6267 | 0.7402 | 0.7406 | 0.7592 | 0.7402 |
| 0.6884 | 0.5541 | 3400 | 0.6222 | 0.7494 | 0.7487 | 0.7584 | 0.7494 |
| 0.655 | 0.5867 | 3600 | 0.6460 | 0.7337 | 0.7333 | 0.7485 | 0.7337 |
| 0.6745 | 0.6193 | 3800 | 0.6133 | 0.7538 | 0.7537 | 0.7574 | 0.7538 |
| 0.6809 | 0.6519 | 4000 | 0.6338 | 0.7442 | 0.7436 | 0.7544 | 0.7442 |
| 0.6674 | 0.6845 | 4200 | 0.6118 | 0.7494 | 0.7506 | 0.7588 | 0.7494 |
| 0.6815 | 0.7171 | 4400 | 0.6173 | 0.7462 | 0.7477 | 0.7587 | 0.7462 |
| 0.652 | 0.7497 | 4600 | 0.5969 | 0.7659 | 0.7656 | 0.7691 | 0.7659 |
| 0.6517 | 0.7823 | 4800 | 0.6170 | 0.7506 | 0.7515 | 0.7615 | 0.7506 |
| 0.6335 | 0.8149 | 5000 | 0.5767 | 0.7731 | 0.7736 | 0.7763 | 0.7731 |
| 0.6362 | 0.8475 | 5200 | 0.6273 | 0.7542 | 0.7550 | 0.7676 | 0.7542 |
| 0.6638 | 0.8801 | 5400 | 0.5773 | 0.7743 | 0.7753 | 0.7795 | 0.7743 |
| 0.6369 | 0.9126 | 5600 | 0.5980 | 0.7534 | 0.7552 | 0.7673 | 0.7534 |
| 0.6551 | 0.9452 | 5800 | 0.5927 | 0.7526 | 0.7546 | 0.7732 | 0.7526 |
| 0.6549 | 0.9778 | 6000 | 0.5673 | 0.7618 | 0.7634 | 0.7709 | 0.7618 |
| 0.5314 | 1.0104 | 6200 | 0.6203 | 0.7590 | 0.7589 | 0.7670 | 0.7590 |
| 0.5127 | 1.0430 | 6400 | 0.5939 | 0.7663 | 0.7665 | 0.7697 | 0.7663 |
| 0.5405 | 1.0756 | 6600 | 0.6012 | 0.7594 | 0.7605 | 0.7714 | 0.7594 |
| 0.5618 | 1.1082 | 6800 | 0.6069 | 0.7614 | 0.7621 | 0.7682 | 0.7614 |
| 0.5509 | 1.1408 | 7000 | 0.6226 | 0.7538 | 0.7552 | 0.7754 | 0.7538 |
| 0.5501 | 1.1734 | 7200 | 0.5793 | 0.7703 | 0.7715 | 0.7765 | 0.7703 |
| 0.5476 | 1.2060 | 7400 | 0.5969 | 0.7627 | 0.7617 | 0.7703 | 0.7627 |
| 0.5434 | 1.2386 | 7600 | 0.5980 | 0.7578 | 0.7590 | 0.7753 | 0.7578 |
| 0.5606 | 1.2712 | 7800 | 0.6319 | 0.7518 | 0.7502 | 0.7659 | 0.7518 |
| 0.5449 | 1.3038 | 8000 | 0.5945 | 0.7574 | 0.7578 | 0.7652 | 0.7574 |
| 0.5099 | 1.3364 | 8200 | 0.6824 | 0.7426 | 0.7427 | 0.7685 | 0.7426 |
| 0.5406 | 1.3690 | 8400 | 0.5831 | 0.7695 | 0.7702 | 0.7737 | 0.7695 |
| 0.5577 | 1.4016 | 8600 | 0.6264 | 0.7490 | 0.7483 | 0.7687 | 0.7490 |
| 0.5502 | 1.4342 | 8800 | 0.5838 | 0.7647 | 0.7644 | 0.7689 | 0.7647 |
| 0.527 | 1.4668 | 9000 | 0.5837 | 0.7675 | 0.7679 | 0.7705 | 0.7675 |
| 0.5066 | 1.4993 | 9200 | 0.5884 | 0.7651 | 0.7660 | 0.7728 | 0.7651 |
| 0.5391 | 1.5319 | 9400 | 0.5754 | 0.7659 | 0.7665 | 0.7697 | 0.7659 |
| 0.5276 | 1.5645 | 9600 | 0.5743 | 0.7795 | 0.7803 | 0.7830 | 0.7795 |
| 0.5329 | 1.5971 | 9800 | 0.5865 | 0.7570 | 0.7585 | 0.7691 | 0.7570 |
| 0.5467 | 1.6297 | 10000 | 0.6229 | 0.7586 | 0.7598 | 0.7695 | 0.7586 |
| 0.5373 | 1.6623 | 10200 | 0.6006 | 0.7602 | 0.7610 | 0.7665 | 0.7602 |
| 0.517 | 1.6949 | 10400 | 0.6037 | 0.7502 | 0.7517 | 0.7668 | 0.7502 |
| 0.5068 | 1.7275 | 10600 | 0.5945 | 0.7655 | 0.7659 | 0.7729 | 0.7655 |
| 0.5491 | 1.7601 | 10800 | 0.6104 | 0.7602 | 0.7615 | 0.7730 | 0.7602 |
| 0.5282 | 1.7927 | 11000 | 0.5829 | 0.7659 | 0.7666 | 0.7781 | 0.7659 |
| 0.5359 | 1.8253 | 11200 | 0.6102 | 0.7622 | 0.7620 | 0.7754 | 0.7622 |
| 0.549 | 1.8579 | 11400 | 0.5678 | 0.7643 | 0.7652 | 0.7724 | 0.7643 |
| 0.525 | 1.8905 | 11600 | 0.6133 | 0.7627 | 0.7635 | 0.7791 | 0.7627 |
| 0.5297 | 1.9231 | 11800 | 0.5893 | 0.7675 | 0.7679 | 0.7745 | 0.7675 |
| 0.5438 | 1.9557 | 12000 | 0.5637 | 0.7731 | 0.7740 | 0.7804 | 0.7731 |
| 0.5426 | 1.9883 | 12200 | 0.5937 | 0.7622 | 0.7624 | 0.7731 | 0.7622 |
| 0.3892 | 2.0209 | 12400 | 0.6167 | 0.7719 | 0.7725 | 0.7766 | 0.7719 |
| 0.3618 | 2.0535 | 12600 | 0.7019 | 0.7687 | 0.7695 | 0.7759 | 0.7687 |
| 0.392 | 2.0860 | 12800 | 0.7179 | 0.7534 | 0.7551 | 0.7795 | 0.7534 |
| 0.3912 | 2.1186 | 13000 | 0.6969 | 0.7518 | 0.7526 | 0.7715 | 0.7518 |
| 0.3798 | 2.1512 | 13200 | 0.6487 | 0.7715 | 0.7725 | 0.7800 | 0.7715 |
| 0.3856 | 2.1838 | 13400 | 0.6196 | 0.7671 | 0.7677 | 0.7709 | 0.7671 |
| 0.358 | 2.2164 | 13600 | 0.7144 | 0.7574 | 0.7574 | 0.7705 | 0.7574 |
| 0.3854 | 2.2490 | 13800 | 0.6709 | 0.7598 | 0.7598 | 0.7753 | 0.7598 |
| 0.3687 | 2.2816 | 14000 | 0.6448 | 0.7631 | 0.7633 | 0.7705 | 0.7631 |
| 0.3746 | 2.3142 | 14200 | 0.6617 | 0.7723 | 0.7728 | 0.7785 | 0.7723 |
| 0.3798 | 2.3468 | 14400 | 0.6468 | 0.7727 | 0.7736 | 0.7814 | 0.7727 |
| 0.3779 | 2.3794 | 14600 | 0.6503 | 0.7691 | 0.7693 | 0.7773 | 0.7691 |
| 0.3871 | 2.4120 | 14800 | 0.6631 | 0.7618 | 0.7614 | 0.7702 | 0.7618 |
| 0.3859 | 2.4446 | 15000 | 0.6825 | 0.7635 | 0.7641 | 0.7772 | 0.7635 |
| 0.4049 | 2.4772 | 15200 | 0.6647 | 0.7655 | 0.7653 | 0.7749 | 0.7655 |
| 0.3812 | 2.5098 | 15400 | 0.7008 | 0.7558 | 0.7563 | 0.7697 | 0.7558 |
| 0.3874 | 2.5424 | 15600 | 0.6808 | 0.7671 | 0.7677 | 0.7764 | 0.7671 |

### Framework versions

- Transformers 4.48.0.dev0
- Pytorch 2.4.1+cu121
- Datasets 3.1.0
- Tokenizers 0.21.0
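The training data and task are not documented on this card, but the model name suggests an NLI-style fine-tune intended for the `zero-shot-classification` pipeline. Under that assumption (and it is only an assumption), usage might look like the minimal sketch below; the repository id, candidate labels, and hypothesis template are illustrative placeholders.

```python
from transformers import pipeline

# Hedged sketch: assumes this checkpoint is an NLI-style model meant for
# zero-shot classification; the repo id and template below are placeholders,
# not documented by this card.
classifier = pipeline(
    "zero-shot-classification",
    model="turkish-zeroshot",  # replace with the actual Hub repo id
)

result = classifier(
    "Dolar kuru bugün rekor kırdı.",                   # "The dollar exchange rate hit a record today."
    candidate_labels=["ekonomi", "spor", "siyaset"],   # economy, sports, politics
    hypothesis_template="Bu metin {} ile ilgilidir.",  # "This text is about {}."
)
print(result["labels"][0], result["scores"][0])
```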