
bertmulti-finetuned-token-reqadjzar

This model is a fine-tuned version of bert-base-multilingual-cased on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):

  • Loss: 1.0215
  • Precision: 0.3729
  • Recall: 0.4783
  • F1: 0.4190
  • Accuracy: 0.8899
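
These figures follow the usual token-classification evaluation pattern: precision, recall and F1 are scored over entity spans (seqeval-style), while accuracy is per token, which is why accuracy can sit near 0.89 while span-level F1 stays around 0.42. The sketch below assumes the seqeval metric from the evaluate library and uses placeholder B-REQ/I-REQ labels; the checkpoint's actual label set is not documented.

```python
# Sketch of a typical seqeval-based evaluation for token classification.
# The B-REQ / I-REQ labels are placeholders; this model's real label set is unknown.
import evaluate

seqeval = evaluate.load("seqeval")  # requires the `evaluate` and `seqeval` packages

predictions = [["O", "B-REQ", "I-REQ", "O"]]
references  = [["O", "B-REQ", "O",     "O"]]

scores = seqeval.compute(predictions=predictions, references=references)
print(scores["overall_precision"], scores["overall_recall"],
      scores["overall_f1"], scores["overall_accuracy"])
```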

Model description

More information needed

Intended uses & limitations

More information needed
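
In the absence of documented usage, the following is a minimal loading sketch, assuming the checkpoint is hosted on the Hugging Face Hub; `your-namespace/bertmulti-finetuned-token-reqadjzar` is a placeholder repository id, and the emitted entity labels depend on the id2label mapping saved with the model.

```python
# Minimal sketch, not the documented usage of this model.
# "your-namespace/bertmulti-finetuned-token-reqadjzar" is a placeholder repo id.
from transformers import pipeline

token_classifier = pipeline(
    "token-classification",
    model="your-namespace/bertmulti-finetuned-token-reqadjzar",
    aggregation_strategy="simple",  # merge sub-word pieces into word-level spans
)

print(token_classifier("Example sentence in one of the languages the model was fine-tuned on."))
```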

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
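
The snippet below is a sketch of TrainingArguments matching the values listed above; the dataset, tokenizer and Trainer wiring are omitted, and `output_dir` and `evaluation_strategy` are assumptions rather than documented settings.

```python
from transformers import TrainingArguments

# Sketch only: reproduces the listed hyperparameters, not the original training script.
training_args = TrainingArguments(
    output_dir="bertmulti-finetuned-token-reqadjzar",  # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    evaluation_strategy="epoch",  # assumption, inferred from the per-epoch results table below
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's AdamW defaults.
)
```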

Training results

Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy
0.71 1.0 24 0.6444 0.0 0.0 0.0 0.5695
0.5158 2.0 48 0.7621 0.0 0.0 0.0 0.6544
0.4702 3.0 72 0.4411 0.0 0.0 0.0 0.7961
0.3326 4.0 96 0.6711 0.0 0.0 0.0 0.7482
0.3134 5.0 120 0.4577 0.0 0.0 0.0 0.8160
0.2036 6.0 144 0.7211 0.0266 0.1087 0.0427 0.7232
0.1918 7.0 168 0.5596 0.0342 0.1087 0.0521 0.8284
0.1388 8.0 192 0.6039 0.0539 0.1957 0.0845 0.8242
0.1525 9.0 216 0.4642 0.0743 0.2391 0.1134 0.8436
0.0965 10.0 240 0.4855 0.1241 0.3913 0.1885 0.8758
0.0757 11.0 264 0.4353 0.0966 0.3043 0.1466 0.8543
0.0689 12.0 288 1.5112 0.0963 0.2826 0.1436 0.6792
0.1298 13.0 312 0.6653 0.1746 0.4783 0.2558 0.8430
0.0656 14.0 336 0.6830 0.0935 0.2826 0.1405 0.8443
0.041 15.0 360 0.6495 0.1231 0.3478 0.1818 0.8411
0.039 16.0 384 0.5259 0.1377 0.4130 0.2065 0.8585
0.031 17.0 408 0.6282 0.2054 0.5 0.2911 0.8479
0.0615 18.0 432 0.6453 0.1959 0.4130 0.2657 0.8559
0.0282 19.0 456 0.7038 0.1324 0.3913 0.1978 0.8462
0.0204 20.0 480 0.5582 0.1759 0.4130 0.2468 0.8669
0.0202 21.0 504 0.9386 0.1852 0.3261 0.2362 0.8224
0.0255 22.0 528 0.8781 0.1714 0.3913 0.2384 0.7990
0.0281 23.0 552 0.6537 0.1875 0.4565 0.2658 0.8833
0.0303 24.0 576 0.6140 0.2319 0.3478 0.2783 0.8735
0.0404 25.0 600 0.8073 0.2062 0.4348 0.2797 0.7813
0.0378 26.0 624 0.7925 0.1852 0.4348 0.2597 0.8601
0.0185 27.0 648 0.6131 0.1835 0.4348 0.2581 0.8879
0.0217 28.0 672 0.8677 0.2347 0.5 0.3194 0.8151
0.0147 29.0 696 0.6513 0.2 0.3913 0.2647 0.8622
0.0228 30.0 720 0.8354 0.2787 0.3696 0.3178 0.8490
0.0298 31.0 744 0.7063 0.1858 0.4565 0.2642 0.8858
0.018 32.0 768 0.9348 0.3188 0.4783 0.3826 0.8688
0.0179 33.0 792 0.7260 0.3014 0.4783 0.3697 0.8814
0.0139 34.0 816 1.6266 0.2427 0.5435 0.3356 0.7781
0.0435 35.0 840 0.5877 0.225 0.3913 0.2857 0.8827
0.0172 36.0 864 0.9602 0.1545 0.4130 0.2249 0.8379
0.02 37.0 888 0.7676 0.2473 0.5 0.3309 0.8696
0.012 38.0 912 0.6857 0.2118 0.3913 0.2748 0.8836
0.0088 39.0 936 0.8894 0.2857 0.5217 0.3692 0.8697
0.0055 40.0 960 0.7700 0.2340 0.4783 0.3143 0.8678
0.014 41.0 984 0.9191 0.2692 0.4565 0.3387 0.8396
0.0061 42.0 1008 0.8042 0.2785 0.4783 0.352 0.8753
0.0059 43.0 1032 1.0644 0.1959 0.4130 0.2657 0.8203
0.0032 44.0 1056 1.1174 0.2949 0.5 0.3710 0.8242
0.0057 45.0 1080 1.3623 0.2963 0.5217 0.3780 0.8346
0.019 46.0 1104 1.0958 0.1932 0.3696 0.2537 0.8465
0.0167 47.0 1128 0.9388 0.1848 0.3696 0.2464 0.8355
0.0131 48.0 1152 1.2771 0.1826 0.4565 0.2609 0.7805
0.0095 49.0 1176 1.0477 0.1944 0.4565 0.2727 0.8332
0.0058 50.0 1200 0.9822 0.2941 0.5435 0.3817 0.8407
0.0056 51.0 1224 1.1512 0.2360 0.4565 0.3111 0.8361
0.0045 52.0 1248 0.8875 0.2468 0.4130 0.3089 0.8693
0.0035 53.0 1272 0.9689 0.2346 0.4130 0.2992 0.8551
0.0066 54.0 1296 0.9921 0.2299 0.4348 0.3008 0.8587
0.0026 55.0 1320 0.8510 0.2817 0.4348 0.3419 0.8758
0.0033 56.0 1344 0.9234 0.2115 0.4783 0.2933 0.8436
0.0125 57.0 1368 1.0792 0.2308 0.3913 0.2903 0.8486
0.0034 58.0 1392 1.1353 0.2609 0.5217 0.3478 0.8274
0.0065 59.0 1416 1.3812 0.2738 0.5 0.3538 0.7993
0.0082 60.0 1440 1.0929 0.2233 0.5 0.3087 0.8429
0.0202 61.0 1464 0.9371 0.1709 0.4348 0.2454 0.8399
0.0063 62.0 1488 0.6318 0.2099 0.3696 0.2677 0.8543
0.0047 63.0 1512 0.8257 0.2018 0.5 0.2875 0.8514
0.0036 64.0 1536 0.8545 0.1963 0.4565 0.2745 0.8484
0.0027 65.0 1560 0.8684 0.2421 0.5 0.3262 0.8539
0.002 66.0 1584 0.8609 0.25 0.5 0.3333 0.8630
0.0022 67.0 1608 0.7618 0.2347 0.5 0.3194 0.8804
0.0026 68.0 1632 0.8460 0.23 0.5 0.3151 0.8654
0.0019 69.0 1656 0.7437 0.2857 0.5217 0.3692 0.8933
0.0027 70.0 1680 0.7911 0.2727 0.4565 0.3415 0.8898
0.0025 71.0 1704 0.8172 0.3333 0.4783 0.3929 0.8880
0.0037 72.0 1728 0.7807 0.2680 0.5652 0.3636 0.8873
0.0032 73.0 1752 0.9164 0.2683 0.4783 0.3438 0.8760
0.0092 74.0 1776 0.6410 0.2976 0.5435 0.3846 0.8836
0.0029 75.0 1800 0.7780 0.2857 0.5217 0.3692 0.8854
0.0017 76.0 1824 0.9096 0.2683 0.4783 0.3438 0.8656
0.0017 77.0 1848 0.8843 0.2911 0.5 0.368 0.8773
0.0019 78.0 1872 0.7888 0.2410 0.4348 0.3101 0.8613
0.0032 79.0 1896 0.9426 0.2241 0.5652 0.3210 0.8490
0.0019 80.0 1920 0.9566 0.25 0.3913 0.3051 0.8708
0.0017 81.0 1944 1.0507 0.2588 0.4783 0.3359 0.8669
0.0015 82.0 1968 1.1118 0.2174 0.4348 0.2899 0.8614
0.0014 83.0 1992 1.1422 0.2299 0.4348 0.3008 0.8548
0.0015 84.0 2016 1.1422 0.2716 0.4783 0.3465 0.8556
0.0013 85.0 2040 1.0874 0.2371 0.5 0.3217 0.8557
0.0015 86.0 2064 1.0420 0.2277 0.5 0.3129 0.8624
0.0013 87.0 2088 1.0851 0.2418 0.4783 0.3212 0.8579
0.0015 88.0 2112 1.1249 0.2556 0.5 0.3382 0.8622
0.0015 89.0 2136 1.0589 0.2667 0.5217 0.3529 0.8617
0.0014 90.0 2160 1.0879 0.2674 0.5 0.3485 0.8497
0.0019 91.0 2184 1.0425 0.2651 0.4783 0.3411 0.8551
0.0015 92.0 2208 1.0137 0.2716 0.4783 0.3465 0.8579
0.0015 93.0 2232 1.0084 0.2716 0.4783 0.3465 0.8619
0.0015 94.0 2256 1.0231 0.2727 0.5217 0.3582 0.8529
0.0014 95.0 2280 1.1031 0.3067 0.5 0.3802 0.8522
0.0014 96.0 2304 1.0001 0.2796 0.5652 0.3741 0.8642
0.0012 97.0 2328 1.0274 0.3253 0.5870 0.4186 0.8683
0.0015 98.0 2352 1.1420 0.3559 0.4565 0.4000 0.8579
0.0154 99.0 2376 0.8248 0.4706 0.5217 0.4948 0.8894
0.0041 100.0 2400 0.8580 0.2892 0.5217 0.3721 0.8768
0.0046 101.0 2424 1.0790 0.1792 0.4130 0.25 0.8623
0.0021 102.0 2448 1.0016 0.25 0.4348 0.3175 0.8766
0.0028 103.0 2472 0.8267 0.2899 0.4348 0.3478 0.8907
0.0026 104.0 2496 1.1740 0.2212 0.5 0.3067 0.8511
0.0018 105.0 2520 1.2264 0.1759 0.4130 0.2468 0.8389
0.0017 106.0 2544 1.1772 0.2451 0.5435 0.3378 0.8468
0.0014 107.0 2568 1.2155 0.2556 0.5 0.3382 0.8386
0.0018 108.0 2592 1.1990 0.2558 0.4783 0.3333 0.8411
0.0022 109.0 2616 1.0769 0.3425 0.5435 0.4202 0.8679
0.0016 110.0 2640 1.0793 0.3538 0.5 0.4144 0.8629
0.0019 111.0 2664 0.8828 0.2680 0.5652 0.3636 0.8823
0.0014 112.0 2688 1.0073 0.3548 0.4783 0.4074 0.8810
0.0016 113.0 2712 0.9562 0.3667 0.4783 0.4151 0.8827
0.0014 114.0 2736 0.9590 0.3438 0.4783 0.4 0.8802
0.0014 115.0 2760 1.0293 0.4 0.5217 0.4528 0.8814
0.0014 116.0 2784 1.0419 0.4068 0.5217 0.4571 0.8804
0.0012 117.0 2808 1.0451 0.4138 0.5217 0.4615 0.8805
0.005 118.0 2832 1.0514 0.4068 0.5217 0.4571 0.8803
0.0019 119.0 2856 1.0440 0.4068 0.5217 0.4571 0.8805
0.0015 120.0 2880 1.0782 0.4 0.5217 0.4528 0.8768
0.0015 121.0 2904 1.0736 0.4211 0.5217 0.4660 0.8765
0.0014 122.0 2928 1.0565 0.3934 0.5217 0.4486 0.8776
0.0013 123.0 2952 1.0496 0.4444 0.5217 0.48 0.8814
0.0012 124.0 2976 1.0805 0.4286 0.5217 0.4706 0.8805
0.0012 125.0 3000 1.1119 0.4211 0.5217 0.4660 0.8809
0.0013 126.0 3024 1.0880 0.4528 0.5217 0.4848 0.8812
0.0014 127.0 3048 1.0198 0.3729 0.4783 0.4190 0.8796
0.0013 128.0 3072 1.0028 0.4 0.5217 0.4528 0.8790
0.0014 129.0 3096 1.0229 0.3529 0.5217 0.4211 0.8835
0.0013 130.0 3120 1.0440 0.3380 0.5217 0.4103 0.8747
0.0013 131.0 3144 1.1109 0.4615 0.5217 0.4898 0.8781
0.0012 132.0 3168 1.1082 0.4706 0.5217 0.4948 0.8812
0.0013 133.0 3192 1.1031 0.4444 0.5217 0.48 0.8806
0.0011 134.0 3216 1.1345 0.3529 0.5217 0.4211 0.8713
0.0012 135.0 3240 1.1631 0.3485 0.5 0.4107 0.8716
0.0012 136.0 3264 1.1461 0.3429 0.5217 0.4138 0.8708
0.0012 137.0 3288 1.1592 0.4138 0.5217 0.4615 0.8683
0.0012 138.0 3312 1.0969 0.4138 0.5217 0.4615 0.8754
0.0013 139.0 3336 1.0575 0.3429 0.5217 0.4138 0.8787
0.0013 140.0 3360 1.0560 0.3636 0.5217 0.4286 0.8826
0.0013 141.0 3384 1.0525 0.3380 0.5217 0.4103 0.8796
0.0011 142.0 3408 1.0548 0.3380 0.5217 0.4103 0.8792
0.0013 143.0 3432 1.0593 0.3478 0.5217 0.4174 0.8802
0.0012 144.0 3456 1.0402 0.375 0.5217 0.4364 0.8827
0.0011 145.0 3480 1.0401 0.375 0.5217 0.4364 0.8828
0.0012 146.0 3504 1.0319 0.3810 0.5217 0.4404 0.8840
0.0012 147.0 3528 1.0328 0.3692 0.5217 0.4324 0.8838
0.0012 148.0 3552 1.1021 0.3433 0.5 0.4071 0.8730
0.0012 149.0 3576 1.0402 0.3485 0.5 0.4107 0.8817
0.0013 150.0 3600 0.9619 0.3086 0.5435 0.3937 0.8883
0.0014 151.0 3624 0.9578 0.3382 0.5 0.4035 0.8843
0.0012 152.0 3648 1.0303 0.3692 0.5217 0.4324 0.8830
0.0013 153.0 3672 1.0571 0.3934 0.5217 0.4486 0.8812
0.0012 154.0 3696 1.0793 0.3692 0.5217 0.4324 0.8812
0.0011 155.0 3720 1.0766 0.375 0.5217 0.4364 0.8803
0.0011 156.0 3744 1.0824 0.3934 0.5217 0.4486 0.8810
0.0012 157.0 3768 1.0841 0.4 0.5217 0.4528 0.8810
0.0011 158.0 3792 1.0866 0.4068 0.5217 0.4571 0.8812
0.0012 159.0 3816 1.1016 0.4 0.5217 0.4528 0.8808
0.0011 160.0 3840 1.1114 0.3810 0.5217 0.4404 0.8793
0.0013 161.0 3864 1.1427 0.2892 0.5217 0.3721 0.8577
0.0011 162.0 3888 1.0292 0.3582 0.5217 0.4248 0.8875
0.0012 163.0 3912 0.9894 0.375 0.5217 0.4364 0.8872
0.0011 164.0 3936 0.9877 0.3636 0.5217 0.4286 0.8870
0.0011 165.0 3960 0.9887 0.3692 0.5217 0.4324 0.8890
0.0012 166.0 3984 0.9874 0.3243 0.5217 0.4 0.8871
0.0011 167.0 4008 0.9992 0.3636 0.5217 0.4286 0.8896
0.0012 168.0 4032 0.9835 0.3692 0.5217 0.4324 0.8903
0.0011 169.0 4056 0.9918 0.3284 0.4783 0.3894 0.8910
0.0011 170.0 4080 0.9960 0.3438 0.4783 0.4 0.8914
0.0011 171.0 4104 1.0065 0.3729 0.4783 0.4190 0.8915
0.0012 172.0 4128 1.0266 0.3929 0.4783 0.4314 0.8908
0.0011 173.0 4152 1.0318 0.3929 0.4783 0.4314 0.8908
0.0011 174.0 4176 1.0329 0.3793 0.4783 0.4231 0.8908
0.0012 175.0 4200 1.0254 0.3860 0.4783 0.4272 0.8910
0.0011 176.0 4224 1.0183 0.4 0.4783 0.4356 0.8912
0.0012 177.0 4248 1.0205 0.3860 0.4783 0.4272 0.8909
0.0011 178.0 4272 1.0232 0.3793 0.4783 0.4231 0.8908
0.0011 179.0 4296 1.0246 0.3860 0.4783 0.4272 0.8908
0.0012 180.0 4320 1.0245 0.3793 0.4783 0.4231 0.8905
0.0012 181.0 4344 1.0223 0.375 0.4565 0.4118 0.8902
0.0011 182.0 4368 1.0169 0.3929 0.4783 0.4314 0.8894
0.0011 183.0 4392 1.0172 0.3929 0.4783 0.4314 0.8893
0.0012 184.0 4416 1.0147 0.3860 0.4783 0.4272 0.8894
0.0012 185.0 4440 1.0145 0.3860 0.4783 0.4272 0.8894
0.0011 186.0 4464 1.0128 0.3729 0.4783 0.4190 0.8897
0.0011 187.0 4488 1.0146 0.3729 0.4783 0.4190 0.8897
0.0011 188.0 4512 1.0160 0.3729 0.4783 0.4190 0.8897
0.0011 189.0 4536 1.0178 0.3729 0.4783 0.4190 0.8898
0.0011 190.0 4560 1.0185 0.3729 0.4783 0.4190 0.8898
0.0011 191.0 4584 1.0171 0.3793 0.4783 0.4231 0.8899
0.0011 192.0 4608 1.0179 0.3729 0.4783 0.4190 0.8898
0.0011 193.0 4632 1.0196 0.3793 0.4783 0.4231 0.8899
0.0012 194.0 4656 1.0188 0.3793 0.4783 0.4231 0.8899
0.0011 195.0 4680 1.0185 0.3793 0.4783 0.4231 0.8899
0.0011 196.0 4704 1.0194 0.3860 0.4783 0.4272 0.8898
0.0011 197.0 4728 1.0206 0.3729 0.4783 0.4190 0.8900
0.0011 198.0 4752 1.0207 0.3729 0.4783 0.4190 0.8900
0.0012 199.0 4776 1.0215 0.3729 0.4783 0.4190 0.8899
0.0011 200.0 4800 1.0215 0.3729 0.4783 0.4190 0.8899

Framework versions

  • Transformers 4.31.0.dev0
  • Pytorch 2.0.0
  • Datasets 2.1.0
  • Tokenizers 0.13.3
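
To check a local environment against these versions, a quick sketch (note that 4.31.0.dev0 is a development build of Transformers, normally installed from source rather than PyPI):

```python
# Print installed versions to compare with the ones reported above.
import datasets
import tokenizers
import torch
import transformers

for name, module in [("Transformers", transformers), ("Pytorch", torch),
                     ("Datasets", datasets), ("Tokenizers", tokenizers)]:
    print(name, module.__version__)
```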