COPAS-mms1ball-Nov29

This model is a fine-tuned version of facebook/mms-1b-all on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3656
  • WER: 0.2399
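For reference, WER (word error rate) is the word-level edit distance between the hypothesis and the reference transcript, divided by the number of reference words; the 0.2399 above means roughly 24 word errors per 100 reference words. A minimal pure-Python sketch of the metric (not the exact implementation used during this evaluation):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[j] holds the edit distance between the first i reference words
    # and the first j hypothesis words (rolling single-row DP).
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        prev_diag, dp[0] = dp[0], i
        for j, h in enumerate(hyp, start=1):
            prev_diag, dp[j] = dp[j], min(
                dp[j] + 1,             # deletion
                dp[j - 1] + 1,         # insertion
                prev_diag + (r != h),  # substitution (free if words match)
            )
    return dp[-1] / len(ref)

print(wer("a b c", "a x c"))  # → 0.3333333333333333
```

Libraries such as `jiwer` or the `evaluate` package provide the same metric with text-normalization options, which is what Transformers examples typically use.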

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 3
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 100
  • mixed_precision_training: Native AMP
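The `linear` scheduler with 100 warmup steps ramps the learning rate from 0 to 1e-05 over the first 100 optimizer steps, then decays it linearly toward 0. A sketch as a pure function; `total_steps` below is an assumed placeholder (the real value depends on dataset size and epochs), not a number from this run:

```python
def linear_lr(step: int, base_lr: float = 1e-5,
              warmup_steps: int = 100, total_steps: int = 24_000) -> float:
    """Linear warmup to base_lr, then linear decay to 0 (HF-style `linear` schedule).

    `total_steps` is an assumed placeholder, not taken from this training run.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # ramp up over the first 100 steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / (total_steps - warmup_steps)  # decay to zero
```

In a Transformers training setup, the equivalent behavior comes from `lr_scheduler_type="linear"` plus `warmup_steps=100` in `TrainingArguments`.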

Training results

Training Loss Epoch Step Validation Loss WER
39.0459 0.2358 100 32.7908 1.0
31.3076 0.4717 200 24.5080 1.0005
22.6116 0.7075 300 16.6952 1.0
14.3915 0.9434 400 9.8732 1.0
8.4511 1.1792 500 5.7561 1.0
4.8318 1.4151 600 3.8778 1.0
3.5683 1.6509 700 3.2052 1.0
3.1283 1.8868 800 3.0170 1.0
3.0561 2.1226 900 2.9596 1.0
2.9704 2.3585 1000 2.9107 1.0
2.9138 2.5943 1100 2.8762 1.0
2.9195 2.8302 1200 2.8405 1.0
2.8696 3.0660 1300 2.7997 1.0
2.824 3.3019 1400 2.7525 1.0
2.7823 3.5377 1500 2.6803 1.0
2.7284 3.7736 1600 2.5883 0.9997
2.6758 4.0094 1700 2.4691 0.9985
2.5467 4.2453 1800 2.3142 0.9871
2.3888 4.4811 1900 2.1150 0.9593
2.2207 4.7170 2000 1.8880 0.9007
2.0405 4.9528 2100 1.6605 0.8418
1.8474 5.1887 2200 1.4562 0.7697
1.6394 5.4245 2300 1.2640 0.6726
1.4813 5.6604 2400 1.1130 0.5468
1.3794 5.8962 2500 1.0028 0.4876
1.2862 6.1321 2600 0.9112 0.4381
1.1636 6.3679 2700 0.8412 0.4133
1.1022 6.6038 2800 0.7828 0.4019
1.1028 6.8396 2900 0.7427 0.3936
0.9554 7.0755 3000 0.7045 0.3873
0.9922 7.3113 3100 0.6779 0.3830
0.9837 7.5472 3200 0.6560 0.3799
0.9329 7.7830 3300 0.6360 0.3764
0.9373 8.0189 3400 0.6202 0.3749
0.8958 8.2547 3500 0.6060 0.3716
0.8097 8.4906 3600 0.5937 0.3703
0.8469 8.7264 3700 0.5847 0.3696
0.9508 8.9623 3800 0.5755 0.3686
0.7845 9.1981 3900 0.5669 0.3670
0.8738 9.4340 4000 0.5585 0.3648
0.8345 9.6698 4100 0.5528 0.3650
0.8014 9.9057 4200 0.5484 0.3635
0.852 10.1415 4300 0.5417 0.3607
0.8325 10.3774 4400 0.5368 0.3595
0.7868 10.6132 4500 0.5317 0.3562
0.7985 10.8491 4600 0.5285 0.3544
0.7187 11.0849 4700 0.5245 0.3534
0.764 11.3208 4800 0.5222 0.3516
0.7842 11.5566 4900 0.5188 0.3473
0.7214 11.7925 5000 0.5149 0.3478
0.8369 12.0283 5100 0.5108 0.3433
0.8013 12.2642 5200 0.5067 0.3415
0.7132 12.5 5300 0.5055 0.3385
0.7433 12.7358 5400 0.5037 0.3375
0.7471 12.9717 5500 0.4994 0.3357
0.7 13.2075 5600 0.4972 0.3344
0.764 13.4434 5700 0.4946 0.3334
0.7371 13.6792 5800 0.4917 0.3319
0.7365 13.9151 5900 0.4892 0.3274
0.7302 14.1509 6000 0.4870 0.3266
0.6932 14.3868 6100 0.4874 0.3243
0.7475 14.6226 6200 0.4852 0.3228
0.7309 14.8585 6300 0.4816 0.3210
0.7329 15.0943 6400 0.4789 0.3218
0.7439 15.3302 6500 0.4770 0.3195
0.719 15.5660 6600 0.4753 0.3175
0.6595 15.8019 6700 0.4742 0.3145
0.6897 16.0377 6800 0.4725 0.3127
0.689 16.2736 6900 0.4706 0.3097
0.7532 16.5094 7000 0.4683 0.3086
0.7149 16.7453 7100 0.4676 0.3076
0.7224 16.9811 7200 0.4659 0.3056
0.6838 17.2170 7300 0.4640 0.3043
0.6294 17.4528 7400 0.4623 0.3043
0.7538 17.6887 7500 0.4610 0.3054
0.6775 17.9245 7600 0.4604 0.3033
0.7451 18.1604 7700 0.4581 0.3011
0.664 18.3962 7800 0.4582 0.3008
0.6857 18.6321 7900 0.4550 0.3008
0.6746 18.8679 8000 0.4531 0.2983
0.6475 19.1038 8100 0.4540 0.3003
0.6577 19.3396 8200 0.4525 0.2973
0.7043 19.5755 8300 0.4510 0.2988
0.6841 19.8113 8400 0.4504 0.2963
0.6529 20.0472 8500 0.4498 0.2963
0.7497 20.2830 8600 0.4466 0.2940
0.6469 20.5189 8700 0.4457 0.2932
0.6383 20.7547 8800 0.4453 0.2922
0.6848 20.9906 8900 0.4441 0.2922
0.6669 21.2264 9000 0.4429 0.2930
0.6405 21.4623 9100 0.4417 0.2917
0.6643 21.6981 9200 0.4417 0.2915
0.6853 21.9340 9300 0.4383 0.2894
0.6366 22.1698 9400 0.4385 0.2864
0.6469 22.4057 9500 0.4381 0.2864
0.6843 22.6415 9600 0.4356 0.2872
0.642 22.8774 9700 0.4364 0.2864
0.7332 23.1132 9800 0.4338 0.2861
0.6393 23.3491 9900 0.4348 0.2864
0.6332 23.5849 10000 0.4324 0.2856
0.6038 23.8208 10100 0.4324 0.2851
0.6561 24.0566 10200 0.4311 0.2824
0.6515 24.2925 10300 0.4293 0.2826
0.6436 24.5283 10400 0.4285 0.2806
0.5898 24.7642 10500 0.4279 0.2808
0.6363 25.0 10600 0.4260 0.2801
0.6533 25.2358 10700 0.4254 0.2783
0.6512 25.4717 10800 0.4243 0.2788
0.6041 25.7075 10900 0.4235 0.2786
0.6423 25.9434 11000 0.4234 0.2770
0.5831 26.1792 11100 0.4225 0.2750
0.6029 26.4151 11200 0.4213 0.2738
0.6696 26.6509 11300 0.4210 0.2740
0.657 26.8868 11400 0.4202 0.2722
0.6323 27.1226 11500 0.4190 0.2743
0.6333 27.3585 11600 0.4180 0.2738
0.635 27.5943 11700 0.4175 0.2715
0.6309 27.8302 11800 0.4189 0.2715
0.6286 28.0660 11900 0.4182 0.2707
0.5783 28.3019 12000 0.4188 0.2682
0.627 28.5377 12100 0.4163 0.2707
0.6092 28.7736 12200 0.4148 0.2687
0.6429 29.0094 12300 0.4133 0.2695
0.6148 29.2453 12400 0.4128 0.2700
0.6757 29.4811 12500 0.4118 0.2685
0.6041 29.7170 12600 0.4124 0.2682
0.6144 29.9528 12700 0.4117 0.2682
0.5617 30.1887 12800 0.4100 0.2672
0.5981 30.4245 12900 0.4100 0.2674
0.6428 30.6604 13000 0.4089 0.2664
0.5913 30.8962 13100 0.4096 0.2652
0.6583 31.1321 13200 0.4078 0.2662
0.6472 31.3679 13300 0.4073 0.2664
0.5755 31.6038 13400 0.4061 0.2649
0.6032 31.8396 13500 0.4063 0.2639
0.5765 32.0755 13600 0.4062 0.2631
0.6069 32.3113 13700 0.4060 0.2624
0.6172 32.5472 13800 0.4055 0.2626
0.603 32.7830 13900 0.4044 0.2624
0.6575 33.0189 14000 0.4033 0.2614
0.6387 33.2547 14100 0.4023 0.2611
0.5852 33.4906 14200 0.4028 0.2614
0.5802 33.7264 14300 0.4017 0.2616
0.6124 33.9623 14400 0.4010 0.2604
0.6248 34.1981 14500 0.4006 0.2606
0.6169 34.4340 14600 0.4009 0.2599
0.564 34.6698 14700 0.4007 0.2581
0.5788 34.9057 14800 0.3997 0.2591
0.6364 35.1415 14900 0.3988 0.2581
0.6257 35.3774 15000 0.3985 0.2576
0.5516 35.6132 15100 0.3984 0.2588
0.6263 35.8491 15200 0.3963 0.2594
0.5153 36.0849 15300 0.3962 0.2578
0.5622 36.3208 15400 0.3960 0.2581
0.5238 36.5566 15500 0.3970 0.2566
0.6735 36.7925 15600 0.3957 0.2571
0.6307 37.0283 15700 0.3958 0.2573
0.6231 37.2642 15800 0.3945 0.2576
0.5458 37.5 15900 0.3931 0.2558
0.6256 37.7358 16000 0.3913 0.2563
0.5762 37.9717 16100 0.3912 0.2551
0.5254 38.2075 16200 0.3917 0.2548
0.6052 38.4434 16300 0.3912 0.2535
0.5942 38.6792 16400 0.3911 0.2538
0.5946 38.9151 16500 0.3902 0.2538
0.5595 39.1509 16600 0.3901 0.2530
0.5734 39.3868 16700 0.3899 0.2528
0.6011 39.6226 16800 0.3899 0.2528
0.6162 39.8585 16900 0.3886 0.2525
0.559 40.0943 17000 0.3885 0.2513
0.5645 40.3302 17100 0.3866 0.2513
0.6226 40.5660 17200 0.3871 0.2515
0.5847 40.8019 17300 0.3864 0.2505
0.5887 41.0377 17400 0.3861 0.2513
0.5798 41.2736 17500 0.3851 0.2515
0.5874 41.5094 17600 0.3846 0.2500
0.5508 41.7453 17700 0.3849 0.2492
0.5838 41.9811 17800 0.3845 0.2490
0.5456 42.2170 17900 0.3831 0.2492
0.5564 42.4528 18000 0.3830 0.2480
0.5869 42.6887 18100 0.3824 0.2462
0.5954 42.9245 18200 0.3830 0.2482
0.6119 43.1604 18300 0.3812 0.2492
0.567 43.3962 18400 0.3812 0.2472
0.5582 43.6321 18500 0.3814 0.2462
0.5892 43.8679 18600 0.3810 0.2465
0.5518 44.1038 18700 0.3806 0.2472
0.5711 44.3396 18800 0.3807 0.2475
0.5688 44.5755 18900 0.3805 0.2475
0.5578 44.8113 19000 0.3806 0.2470
0.6209 45.0472 19100 0.3794 0.2467
0.5857 45.2830 19200 0.3804 0.2465
0.5627 45.5189 19300 0.3804 0.2467
0.5763 45.7547 19400 0.3788 0.2454
0.5156 45.9906 19500 0.3788 0.2462
0.5729 46.2264 19600 0.3787 0.2454
0.5487 46.4623 19700 0.3774 0.2444
0.5907 46.6981 19800 0.3777 0.2454
0.5644 46.9340 19900 0.3768 0.2452
0.5931 47.1698 20000 0.3767 0.2449
0.5478 47.4057 20100 0.3771 0.2429
0.5243 47.6415 20200 0.3771 0.2442
0.5685 47.8774 20300 0.3757 0.2419
0.5726 48.1132 20400 0.3754 0.2442
0.5173 48.3491 20500 0.3760 0.2439
0.5968 48.5849 20600 0.3765 0.2429
0.5777 48.8208 20700 0.3748 0.2401
0.5744 49.0566 20800 0.3750 0.2409
0.5613 49.2925 20900 0.3752 0.2414
0.5391 49.5283 21000 0.3744 0.2406
0.5792 49.7642 21100 0.3740 0.2419
0.5557 50.0 21200 0.3738 0.2424
0.5856 50.2358 21300 0.3736 0.2401
0.5424 50.4717 21400 0.3729 0.2412
0.5288 50.7075 21500 0.3729 0.2396
0.5527 50.9434 21600 0.3720 0.2394
0.615 51.1792 21700 0.3722 0.2389
0.5514 51.4151 21800 0.3719 0.2396
0.5393 51.6509 21900 0.3719 0.2396
0.5501 51.8868 22000 0.3714 0.2391
0.5798 52.1226 22100 0.3700 0.2396
0.5434 52.3585 22200 0.3702 0.2391
0.5552 52.5943 22300 0.3693 0.2401
0.546 52.8302 22400 0.3692 0.2394
0.5097 53.0660 22500 0.3695 0.2394
0.5282 53.3019 22600 0.3698 0.2396
0.5435 53.5377 22700 0.3692 0.2389
0.5104 53.7736 22800 0.3695 0.2394
0.5664 54.0094 22900 0.3685 0.2384
0.5502 54.2453 23000 0.3686 0.2389
0.558 54.4811 23100 0.3682 0.2396
0.5888 54.7170 23200 0.3679 0.2389
0.4898 54.9528 23300 0.3680 0.2379
0.6259 55.1887 23400 0.3676 0.2371
0.5581 55.4245 23500 0.3663 0.2386
0.5439 55.6604 23600 0.3672 0.2391
0.5254 55.8962 23700 0.3666 0.2391
0.5214 56.1321 23800 0.3659 0.2384
0.5219 56.3679 23900 0.3657 0.2391
0.564 56.6038 24000 0.3656 0.2399
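mms-1b-all is a CTC model, so the validation WER above is computed on greedy-decoded transcripts: take the argmax token per frame, collapse consecutive repeats, and drop the blank token. A minimal sketch of that decoding rule, assuming blank id 0 (the actual blank id comes from the tokenizer):

```python
def greedy_ctc_decode(frame_ids, blank_id=0):
    """Collapse consecutive repeats, then drop blanks (standard CTC greedy rule)."""
    out, prev = [], None
    for i in frame_ids:
        if i != prev and i != blank_id:
            out.append(i)
        prev = i
    return out

# Frames: blank, 'a', 'a', blank, 'a', 'b', 'b', blank  →  decoded ids for "a a b"
print(greedy_ctc_decode([0, 1, 1, 0, 1, 2, 2, 0]))  # → [1, 1, 2]
```

In practice `processor.batch_decode(torch.argmax(logits, dim=-1))` performs this step (plus id-to-character mapping) for you.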

Framework versions

  • Transformers 4.43.4
  • Pytorch 2.4.1
  • Datasets 3.0.0
  • Tokenizers 0.19.1
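Assuming the standard PyPI package names (the PyTorch wheel is published as `torch`), the versions above can be pinned like so:

```shell
pip install "transformers==4.43.4" "torch==2.4.1" "datasets==3.0.0" "tokenizers==0.19.1"
```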