---
base_model: yhavinga/ul2-large-dutch
library_name: peft
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: ul2-large-dutch-finetuned-oba-book-search
  results: []
---

# ul2-large-dutch-finetuned-oba-book-search

This model is a fine-tuned version of [yhavinga/ul2-large-dutch](https://huggingface.co/yhavinga/ul2-large-dutch) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.9161
- Top-5-accuracy: 4.3582

## Model description

More information needed

## Intended uses & limitations

More information needed
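A minimal loading sketch, assuming this repository hosts a PEFT adapter on top of the seq2seq base model; the adapter repository id, example prompt, and generation settings are illustrative assumptions rather than documented usage:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from peft import PeftModel

base_id = "yhavinga/ul2-large-dutch"                      # base model named in this card
adapter_id = "ul2-large-dutch-finetuned-oba-book-search"  # assumed adapter repo id

# Load the base seq2seq model and attach the fine-tuned PEFT adapter to it.
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForSeq2SeqLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(model, adapter_id)

# Example Dutch book-search query; the prompt format used during fine-tuning is not documented here.
inputs = tokenizer("Zoek een boek over de geschiedenis van Amsterdam", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```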
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.3
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Top-5-accuracy |
|:-------------:|:------:|:-----:|:---------------:|:--------------:|
| 6.7818        | 0.0424 | 500   | 4.6703          | 0.0995         |
| 6.1358        | 0.0848 | 1000  | 4.6934          | 0.0            |
| 5.7889        | 0.1272 | 1500  | 4.4938          | 0.0995         |
| 5.5769        | 0.1696 | 2000  | 4.4436          | 0.1592         |
| 5.5432        | 0.2120 | 2500  | 4.5134          | 0.1393         |
| 5.4227        | 0.2544 | 3000  | 4.3984          | 0.3184         |
| 5.462         | 0.2968 | 3500  | 4.3750          | 0.3582         |
| 5.3557        | 0.3392 | 4000  | 4.3295          | 0.7363         |
| 5.3069        | 0.3815 | 4500  | 4.3453          | 0.5771         |
| 5.1367        | 0.4239 | 5000  | 4.2607          | 1.3930         |
| 5.2098        | 0.4663 | 5500  | 4.2917          | 1.2935         |
| 5.2545        | 0.5087 | 6000  | 4.2551          | 2.2687         |
| 5.161         | 0.5511 | 6500  | 4.2483          | 2.0896         |
| 5.0716        | 0.5935 | 7000  | 4.2450          | 2.6866         |
| 5.0687        | 0.6359 | 7500  | 4.2535          | 2.4080         |
| 5.0434        | 0.6783 | 8000  | 4.1765          | 3.1443         |
| 5.0132        | 0.7207 | 8500  | 4.1843          | 3.2836         |
| 4.9609        | 0.7631 | 9000  | 4.1817          | 3.3632         |
| 4.983         | 0.8055 | 9500  | 4.1351          | 3.9403         |
| 4.9225        | 0.8479 | 10000 | 4.0950          | 4.1194         |
| 4.8771        | 0.8903 | 10500 | 4.1032          | 4.1194         |
| 4.8627        | 0.9327 | 11000 | 4.1021          | 3.9204         |
| 4.8935        | 0.9751 | 11500 | 4.0641          | 4.1791         |
| 4.8718        | 1.0175 | 12000 | 4.0695          | 4.1592         |
| 4.8557        | 1.0599 | 12500 | 4.0757          | 4.0597         |
| 4.8136        | 1.1023 | 13000 | 4.0492          | 4.1990         |
| 4.8281        | 1.1446 | 13500 | 4.0538          | 4.0            |
| 4.8293        | 1.1870 | 14000 | 4.0315          | 4.3184         |
| 4.831         | 1.2294 | 14500 | 4.0417          | 4.1194         |
| 4.8232        | 1.2718 | 15000 | 4.0157          | 4.3383         |
| 4.7911        | 1.3142 | 15500 | 4.0246          | 4.3383         |
| 4.7865        | 1.3566 | 16000 | 3.9911          | 4.4975         |
| 4.8019        | 1.3990 | 16500 | 4.0177          | 4.2786         |
| 4.796         | 1.4414 | 17000 | 4.0278          | 4.3582         |
| 4.8138        | 1.4838 | 17500 | 3.9919          | 4.2587         |
| 4.7367        | 1.5262 | 18000 | 3.9809          | 4.4378         |
| 4.757         | 1.5686 | 18500 | 3.9729          | 4.4179         |
| 4.7352        | 1.6110 | 19000 | 3.9750          | 4.3980         |
| 4.7663        | 1.6534 | 19500 | 3.9824          | 4.3184         |
| 4.6772        | 1.6958 | 20000 | 3.9843          | 4.3383         |
| 4.7573        | 1.7382 | 20500 | 3.9641          | 4.2189         |
| 4.7402        | 1.7806 | 21000 | 3.9654          | 4.3980         |
| 4.7006        | 1.8230 | 21500 | 3.9557          | 4.2587         |
| 4.7047        | 1.8654 | 22000 | 3.9606          | 4.2985         |
| 4.6683        | 1.9077 | 22500 | 3.9558          | 4.2985         |
| 4.725         | 1.9501 | 23000 | 3.9382          | 4.3383         |
| 4.7176        | 1.9925 | 23500 | 3.9422          | 4.4776         |
| 4.7194        | 2.0349 | 24000 | 3.9445          | 4.2985         |
| 4.6886        | 2.0773 | 24500 | 3.9368          | 4.4378         |
| 4.6876        | 2.1197 | 25000 | 3.9245          | 4.4179         |
| 4.6877        | 2.1621 | 25500 | 3.9326          | 4.3383         |
| 4.7219        | 2.2045 | 26000 | 3.9296          | 4.3781         |
| 4.6815        | 2.2469 | 26500 | 3.9279          | 4.3184         |
| 4.6839        | 2.2893 | 27000 | 3.9276          | 4.2587         |
| 4.6103        | 2.3317 | 27500 | 3.9251          | 4.2985         |
| 4.6566        | 2.3741 | 28000 | 3.9307          | 4.2985         |
| 4.6523        | 2.4165 | 28500 | 3.9236          | 4.2587         |
| 4.6363        | 2.4589 | 29000 | 3.9193          | 4.2786         |
| 4.6575        | 2.5013 | 29500 | 3.9185          | 4.2388         |
| 4.6161        | 2.5437 | 30000 | 3.9227          | 4.3184         |
| 4.644         | 2.5861 | 30500 | 3.9162          | 4.2985         |
| 4.6537        | 2.6285 | 31000 | 3.9169          | 4.3781         |
| 4.6405        | 2.6708 | 31500 | 3.9214          | 4.3781         |
| 4.7401        | 2.7132 | 32000 | 3.9191          | 4.3781         |
| 4.6907        | 2.7556 | 32500 | 3.9161          | 4.3582         |
| 4.7139        | 2.7980 | 33000 | 3.9169          | 4.3781         |
| 4.6537        | 2.8404 | 33500 | 3.9171          | 4.3582         |
| 4.7395        | 2.8828 | 34000 | 3.9162          | 4.3383         |
| 4.58          | 2.9252 | 34500 | 3.9162          | 4.3582         |
| 4.6671        | 2.9676 | 35000 | 3.9161          | 4.3582         |

### Framework versions

- PEFT 0.11.0
- Transformers 4.44.2
- Pytorch 1.13.0+cu116
- Datasets 3.0.0
- Tokenizers 0.19.1
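For reference, a minimal `Seq2SeqTrainingArguments` sketch that mirrors the values listed under *Training hyperparameters* above; the output directory, evaluation cadence, and any setting not listed in the card are assumptions, not values recovered from the original run:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: reproduces the hyperparameters reported in this card;
# unlisted settings below are assumptions.
training_args = Seq2SeqTrainingArguments(
    output_dir="ul2-large-dutch-finetuned-oba-book-search",  # assumed output path
    learning_rate=0.3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,            # Adam betas/epsilon as reported (library defaults)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    num_train_epochs=3,
    eval_strategy="steps",     # the results table reports evaluation every 500 steps
    eval_steps=500,
    logging_steps=500,
)
```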