
bankstatementmodelver7

This model is a fine-tuned version of deepset/roberta-base-squad2, an extractive question-answering model, trained on an undocumented dataset. It achieves the following result on the evaluation set (a short usage sketch follows):

  • Loss: 0.0745

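Because the base checkpoint, deepset/roberta-base-squad2, is an extractive question-answering model, this fine-tune can presumably be queried with the standard Transformers question-answering pipeline. The sketch below is a minimal, hedged example: the repository id Souvik123/bankstatementmodelver7, the sample context, and the question are illustrative assumptions rather than documented usage.

```python
# Minimal usage sketch. Assumptions: the checkpoint is published as
# "Souvik123/bankstatementmodelver7" and keeps the extractive QA head of
# deepset/roberta-base-squad2; the context and question are made up.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="Souvik123/bankstatementmodelver7",
    tokenizer="Souvik123/bankstatementmodelver7",
)

context = (
    "Statement period: 01 Mar 2023 to 31 Mar 2023. "
    "Closing balance: 4,512.88 USD. Account holder: J. Doe."
)
result = qa(question="What is the closing balance?", context=context)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': '4,512.88 USD'}
```
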
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 11
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150

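These settings map onto standard Hugging Face TrainingArguments. The sketch below is a hedged reconstruction of that configuration, not the original training script: output_dir and evaluation_strategy are assumptions, and the dataset loading, preprocessing, and Trainer wiring are omitted because they are not documented in this card.

```python
# Hedged reconstruction of the hyperparameters listed above.
# output_dir and evaluation_strategy are assumptions; the Adam-style
# betas/epsilon are set as reported in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bankstatementmodelver7",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=11,
    seed=42,
    num_train_epochs=150,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed from the per-epoch validation losses below
)
print(training_args.learning_rate, training_args.num_train_epochs)
```
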
Training results

Training Loss Epoch Step Validation Loss
0.0981 1.0 532 0.0672
0.0425 2.0 1064 0.0565
0.0376 3.0 1596 0.0546
0.026 4.0 2128 0.0309
0.0258 5.0 2660 0.0258
0.0211 6.0 3192 0.0397
0.0184 7.0 3724 0.0549
0.0222 8.0 4256 0.0354
0.0191 9.0 4788 0.0216
0.0209 10.0 5320 0.0403
0.0142 11.0 5852 0.0325
0.0143 12.0 6384 0.0317
0.0139 13.0 6916 0.0337
0.0146 14.0 7448 0.0315
0.0142 15.0 7980 0.0321
0.0132 16.0 8512 0.0216
0.0118 17.0 9044 0.0337
0.0174 18.0 9576 0.0427
0.0141 19.0 10108 0.0326
0.0127 20.0 10640 0.0408
0.014 21.0 11172 0.0355
0.0098 22.0 11704 0.0300
0.0116 23.0 12236 0.0220
0.012 24.0 12768 0.0345
0.0135 25.0 13300 0.0351
0.01 26.0 13832 0.0282
0.0091 27.0 14364 0.0291
0.0094 28.0 14896 0.0512
0.0116 29.0 15428 0.0278
0.0077 30.0 15960 0.0447
0.0096 31.0 16492 0.0338
0.0097 32.0 17024 0.0302
0.0098 33.0 17556 0.0279
0.0093 34.0 18088 0.0260
0.0099 35.0 18620 0.0432
0.0104 36.0 19152 0.0297
0.0083 37.0 19684 0.0288
0.0076 38.0 20216 0.0404
0.0114 39.0 20748 0.0366
0.0073 40.0 21280 0.0381
0.0102 41.0 21812 0.0473
0.0082 42.0 22344 0.0386
0.0064 43.0 22876 0.0172
0.0081 44.0 23408 0.0626
0.0075 45.0 23940 0.0410
0.0077 46.0 24472 0.1468
0.0095 47.0 25004 0.0436
0.0068 48.0 25536 0.0494
0.0055 49.0 26068 0.0484
0.0051 50.0 26600 0.0438
0.004 51.0 27132 0.0398
0.0043 52.0 27664 0.0546
0.005 53.0 28196 0.0509
0.0033 54.0 28728 0.0510
0.0054 55.0 29260 0.0554
0.004 56.0 29792 0.0430
0.0037 57.0 30324 0.0622
0.0028 58.0 30856 0.0573
0.0055 59.0 31388 0.0585
0.002 60.0 31920 0.0508
0.005 61.0 32452 0.0648
0.0031 62.0 32984 0.0541
0.0039 63.0 33516 0.0567
0.0018 64.0 34048 0.0627
0.002 65.0 34580 0.0445
0.003 66.0 35112 0.0708
0.0009 67.0 35644 0.0528
0.0015 68.0 36176 0.0613
0.0019 69.0 36708 0.0576
0.0023 70.0 37240 0.0592
0.0018 71.0 37772 0.0499
0.0011 72.0 38304 0.0495
0.0014 73.0 38836 0.0463
0.0014 74.0 39368 0.0493
0.0017 75.0 39900 0.0532
0.0008 76.0 40432 0.0666
0.0005 77.0 40964 0.0514
0.002 78.0 41496 0.0702
0.0026 79.0 42028 0.0426
0.0001 80.0 42560 0.0481
0.0019 81.0 43092 0.0551
0.0001 82.0 43624 0.0550
0.0 83.0 44156 0.0613
0.0012 84.0 44688 0.0568
0.0006 85.0 45220 0.0602
0.0001 86.0 45752 0.0623
0.0004 87.0 46284 0.0522
0.0007 88.0 46816 0.0647
0.0001 89.0 47348 0.0593
0.0002 90.0 47880 0.0552
0.0016 91.0 48412 0.0475
0.0005 92.0 48944 0.0531
0.0011 93.0 49476 0.0574
0.0 94.0 50008 0.0591
0.0 95.0 50540 0.0606
0.0005 96.0 51072 0.0599
0.0018 97.0 51604 0.0505
0.0 98.0 52136 0.0568
0.0011 99.0 52668 0.0692
0.0 100.0 53200 0.0702
0.0002 101.0 53732 0.0743
0.0 102.0 54264 0.0822
0.0007 103.0 54796 0.0905
0.0001 104.0 55328 0.0822
0.0005 105.0 55860 0.0792
0.0004 106.0 56392 0.0683
0.0018 107.0 56924 0.0526
0.0029 108.0 57456 0.0600
0.0005 109.0 57988 0.0631
0.0 110.0 58520 0.0659
0.0006 111.0 59052 0.0663
0.0 112.0 59584 0.0681
0.0012 113.0 60116 0.0537
0.0 114.0 60648 0.0558
0.0 115.0 61180 0.0574
0.0006 116.0 61712 0.0563
0.0 117.0 62244 0.0479
0.0015 118.0 62776 0.0584
0.0 119.0 63308 0.0606
0.0 120.0 63840 0.0624
0.0006 121.0 64372 0.0655
0.0003 122.0 64904 0.0688
0.0 123.0 65436 0.0790
0.0001 124.0 65968 0.0713
0.0 125.0 66500 0.0721
0.0006 126.0 67032 0.0689
0.0 127.0 67564 0.0679
0.0 128.0 68096 0.0693
0.0005 129.0 68628 0.0688
0.0 130.0 69160 0.0696
0.0 131.0 69692 0.0702
0.0 132.0 70224 0.0715
0.0 133.0 70756 0.0727
0.0 134.0 71288 0.0708
0.0 135.0 71820 0.0715
0.0 136.0 72352 0.0724
0.0 137.0 72884 0.0762
0.0 138.0 73416 0.0797
0.0 139.0 73948 0.0800
0.0 140.0 74480 0.0808
0.0 141.0 75012 0.0834
0.0 142.0 75544 0.0833
0.0014 143.0 76076 0.0782
0.0 144.0 76608 0.0748
0.0 145.0 77140 0.0749
0.0 146.0 77672 0.0751
0.0 147.0 78204 0.0738
0.0 148.0 78736 0.0744
0.0 149.0 79268 0.0744
0.0 150.0 79800 0.0745

Framework versions

  • Transformers 4.33.2
  • Pytorch 2.0.1+cu118
  • Tokenizers 0.13.3
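
If needed, the runtime environment can be compared against the versions reported above. The sketch below is only a convenience check; matching these exact versions is not a stated requirement of the model.

```python
# Sanity-check that the runtime roughly matches the versions reported above.
# These are the versions used for training, not necessarily hard requirements.
import tokenizers
import torch
import transformers

expected = {"transformers": "4.33.2", "torch": "2.0.1+cu118", "tokenizers": "0.13.3"}
found = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, version in found.items():
    status = "OK" if version == expected[name] else "differs"
    print(f"{name}: {version} (trained with {expected[name]}; {status})")
```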