# Layout-finetuned-fr-model-50instances20-100epochs-5e-05lr
This model is a fine-tuned version of microsoft/layoutxlm-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0000
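The card does not state the downstream task or label set, so the snippet below assumes a token-classification setup (the most common use of LayoutXLM); it is a minimal loading sketch, not the author's exact pipeline.

```python
# Minimal sketch for loading this checkpoint. Token classification is an
# assumption; swap in the class that matches the actual fine-tuning objective.
# Note: the LayoutLMv2/LayoutXLM model code requires detectron2, and the
# processor needs pytesseract when apply_ocr is enabled (the default).
from PIL import Image
from transformers import LayoutLMv2ForTokenClassification, LayoutXLMProcessor

model_id = "AntonioTH/Layout-finetuned-fr-model-50instances20-100epochs-5e-05lr"

# LayoutXLM reuses the LayoutLMv2 architecture, so the LayoutLMv2 classes apply.
model = LayoutLMv2ForTokenClassification.from_pretrained(model_id)
processor = LayoutXLMProcessor.from_pretrained("microsoft/layoutxlm-base")

# Run one document image through the model (the path is a placeholder).
image = Image.open("document.png").convert("RGB")
encoding = processor(image, return_tensors="pt")
outputs = model(**encoding)
predictions = outputs.logits.argmax(-1)
```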
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation, `adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: reduce_lr_on_plateau
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 100
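As a rough reconstruction, the hyperparameters above map to `TrainingArguments` like the following; `output_dir` and the evaluation cadence (the results table logs a validation loss every 10 steps) are assumptions not stated explicitly on the card.

```python
# Hedged reconstruction of the training configuration from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutxlm-finetuned-fr",  # placeholder, not stated on the card
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                  # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="reduce_lr_on_plateau",
    warmup_ratio=0.06,
    num_train_epochs=100,
    eval_strategy="steps",                # assumed from the 10-step logging cadence
    eval_steps=10,
)
```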
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
3.3707 | 0.7692 | 10 | 0.8298 |
0.33 | 1.5385 | 20 | 0.0024 |
0.0022 | 2.3077 | 30 | 0.0003 |
0.0814 | 3.0769 | 40 | 0.0002 |
0.0004 | 3.8462 | 50 | 0.0001 |
0.0003 | 4.6154 | 60 | 0.0001 |
0.0002 | 5.3846 | 70 | 0.0001 |
0.0002 | 6.1538 | 80 | 0.0001 |
0.0002 | 6.9231 | 90 | 0.0001 |
0.0002 | 7.6923 | 100 | 0.0001 |
0.0002 | 8.4615 | 110 | 0.0001 |
0.0002 | 9.2308 | 120 | 0.0001 |
0.0001 | 10.0 | 130 | 0.0001 |
0.0001 | 10.7692 | 140 | 0.0001 |
0.0001 | 11.5385 | 150 | 0.0001 |
0.0001 | 12.3077 | 160 | 0.0001 |
0.0001 | 13.0769 | 170 | 0.0001 |
0.0001 | 13.8462 | 180 | 0.0001 |
0.0001 | 14.6154 | 190 | 0.0000 |
0.0001 | 15.3846 | 200 | 0.0000 |
0.0001 | 16.1538 | 210 | 0.0000 |
0.0001 | 16.9231 | 220 | 0.0000 |
0.0001 | 17.6923 | 230 | 0.0000 |
0.0001 | 18.4615 | 240 | 0.0000 |
0.0001 | 19.2308 | 250 | 0.0000 |
0.0001 | 20.0 | 260 | 0.0000 |
0.0001 | 20.7692 | 270 | 0.0000 |
0.0001 | 21.5385 | 280 | 0.0000 |
0.0001 | 22.3077 | 290 | 0.0000 |
0.0001 | 23.0769 | 300 | 0.0000 |
0.0001 | 23.8462 | 310 | 0.0000 |
0.0001 | 24.6154 | 320 | 0.0000 |
0.0001 | 25.3846 | 330 | 0.0000 |
0.0001 | 26.1538 | 340 | 0.0000 |
0.0001 | 26.9231 | 350 | 0.0000 |
0.0001 | 27.6923 | 360 | 0.0000 |
0.0001 | 28.4615 | 370 | 0.0000 |
0.0001 | 29.2308 | 380 | 0.0000 |
0.0001 | 30.0 | 390 | 0.0000 |
0.0001 | 30.7692 | 400 | 0.0000 |
0.0001 | 31.5385 | 410 | 0.0000 |
0.0001 | 32.3077 | 420 | 0.0000 |
0.0001 | 33.0769 | 430 | 0.0000 |
0.0001 | 33.8462 | 440 | 0.0000 |
0.0001 | 34.6154 | 450 | 0.0000 |
0.0001 | 35.3846 | 460 | 0.0000 |
0.0001 | 36.1538 | 470 | 0.0000 |
0.0 | 36.9231 | 480 | 0.0000 |
0.0 | 37.6923 | 490 | 0.0000 |
0.0 | 38.4615 | 500 | 0.0000 |
0.0 | 39.2308 | 510 | 0.0000 |
0.0 | 40.0 | 520 | 0.0000 |
0.0 | 40.7692 | 530 | 0.0000 |
0.0 | 41.5385 | 540 | 0.0000 |
0.0 | 42.3077 | 550 | 0.0000 |
0.0 | 43.0769 | 560 | 0.0000 |
0.0 | 43.8462 | 570 | 0.0000 |
0.0 | 44.6154 | 580 | 0.0000 |
0.0 | 45.3846 | 590 | 0.0000 |
0.0 | 46.1538 | 600 | 0.0000 |
0.0 | 46.9231 | 610 | 0.0000 |
0.0 | 47.6923 | 620 | 0.0000 |
0.0 | 48.4615 | 630 | 0.0000 |
0.0 | 49.2308 | 640 | 0.0000 |
0.0 | 50.0 | 650 | 0.0000 |
0.0 | 50.7692 | 660 | 0.0000 |
0.0 | 51.5385 | 670 | 0.0000 |
0.0 | 52.3077 | 680 | 0.0000 |
0.0 | 53.0769 | 690 | 0.0000 |
0.0 | 53.8462 | 700 | 0.0000 |
0.0 | 54.6154 | 710 | 0.0000 |
0.0 | 55.3846 | 720 | 0.0000 |
0.0 | 56.1538 | 730 | 0.0000 |
0.0 | 56.9231 | 740 | 0.0000 |
0.0 | 57.6923 | 750 | 0.0000 |
0.0 | 58.4615 | 760 | 0.0000 |
0.0 | 59.2308 | 770 | 0.0000 |
0.0 | 60.0 | 780 | 0.0000 |
0.0 | 60.7692 | 790 | 0.0000 |
0.0 | 61.5385 | 800 | 0.0000 |
0.0 | 62.3077 | 810 | 0.0000 |
0.0 | 63.0769 | 820 | 0.0000 |
0.0 | 63.8462 | 830 | 0.0000 |
0.0 | 64.6154 | 840 | 0.0000 |
0.0 | 65.3846 | 850 | 0.0000 |
0.0 | 66.1538 | 860 | 0.0000 |
0.0 | 66.9231 | 870 | 0.0000 |
0.0 | 67.6923 | 880 | 0.0000 |
0.0 | 68.4615 | 890 | 0.0000 |
0.0 | 69.2308 | 900 | 0.0000 |
0.0 | 70.0 | 910 | 0.0000 |
0.0 | 70.7692 | 920 | 0.0000 |
0.0 | 71.5385 | 930 | 0.0000 |
0.0 | 72.3077 | 940 | 0.0000 |
0.0 | 73.0769 | 950 | 0.0000 |
0.0 | 73.8462 | 960 | 0.0000 |
0.0 | 74.6154 | 970 | 0.0000 |
0.0 | 75.3846 | 980 | 0.0000 |
0.0 | 76.1538 | 990 | 0.0000 |
0.0 | 76.9231 | 1000 | 0.0000 |
0.0 | 77.6923 | 1010 | 0.0000 |
0.0 | 78.4615 | 1020 | 0.0000 |
0.0 | 79.2308 | 1030 | 0.0000 |
0.0 | 80.0 | 1040 | 0.0000 |
0.0 | 80.7692 | 1050 | 0.0000 |
0.0 | 81.5385 | 1060 | 0.0000 |
0.0 | 82.3077 | 1070 | 0.0000 |
0.0 | 83.0769 | 1080 | 0.0000 |
0.0 | 83.8462 | 1090 | 0.0000 |
0.0 | 84.6154 | 1100 | 0.0000 |
0.0 | 85.3846 | 1110 | 0.0000 |
0.0 | 86.1538 | 1120 | 0.0000 |
0.0 | 86.9231 | 1130 | 0.0000 |
0.0 | 87.6923 | 1140 | 0.0000 |
0.0 | 88.4615 | 1150 | 0.0000 |
0.0 | 89.2308 | 1160 | 0.0000 |
0.0 | 90.0 | 1170 | 0.0000 |
0.0 | 90.7692 | 1180 | 0.0000 |
0.0 | 91.5385 | 1190 | 0.0000 |
0.0 | 92.3077 | 1200 | 0.0000 |
0.0 | 93.0769 | 1210 | 0.0000 |
0.0 | 93.8462 | 1220 | 0.0000 |
0.0 | 94.6154 | 1230 | 0.0000 |
0.0 | 95.3846 | 1240 | 0.0000 |
0.0 | 96.1538 | 1250 | 0.0000 |
0.0 | 96.9231 | 1260 | 0.0000 |
0.0 | 97.6923 | 1270 | 0.0000 |
0.0 | 98.4615 | 1280 | 0.0000 |
0.0 | 99.2308 | 1290 | 0.0000 |
0.0 | 100.0 | 1300 | 0.0000 |
### Framework versions
- Transformers 4.48.0
- Pytorch 2.4.1.post100
- Datasets 3.2.0
- Tokenizers 0.21.0
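To check that a local environment matches the versions above (a generic check, nothing specific to this model):

```python
# Print installed versions of the packages listed above and compare with the card.
import importlib.metadata as md

for pkg in ("transformers", "torch", "datasets", "tokenizers"):
    print(f"{pkg}=={md.version(pkg)}")
```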