---
library_name: transformers
language:
  - ja
license: apache-2.0
base_model: rinna/japanese-hubert-base
tags:
  - automatic-speech-recognition
  - mozilla-foundation/common_voice_13_0
  - generated_from_trainer
datasets:
  - common_voice_13_0
metrics:
  - wer
model-index:
  - name: Hubert-common_voice-ja-demo-japanese-debug-cosine
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: MOZILLA-FOUNDATION/COMMON_VOICE_13_0 - JA
          type: common_voice_13_0
          config: ja
          split: test
          args: 'Config: ja, Training split: train+validation, Eval split: test'
        metrics:
          - name: Wer
            type: wer
            value: 1.946033024567056
---

Hubert-common_voice-ja-demo-japanese-debug-cosine

This model is a fine-tuned version of rinna/japanese-hubert-base on the MOZILLA-FOUNDATION/COMMON_VOICE_13_0 - JA dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3645
  • Wer: 1.9460
  • Cer: 0.5340
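
The card does not yet include usage instructions, so the following is a minimal inference sketch rather than the author's own code. The repo id is a guess based on the model name, and the sketch assumes the checkpoint ships a CTC head with a Wav2Vec2-style processor, as is typical for HuBERT fine-tunes in transformers.

```python
# Minimal inference sketch (not from the original training code).
# The repo id below is an assumption based on the model name.
import torch
import librosa
from transformers import HubertForCTC, Wav2Vec2Processor

model_id = "utakumi/Hubert-common_voice-ja-demo-japanese-debug-cosine"  # assumed repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = HubertForCTC.from_pretrained(model_id).eval()

# HuBERT expects mono 16 kHz input, so resample the clip on load.
speech, _ = librosa.load("sample.wav", sr=16000)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: per-frame argmax, then collapse repeats and blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```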

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
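
The metadata above records the split configuration (config ja, training on train+validation, evaluation on test). A sketch of how those splits could be loaded is shown below; it is not the original preprocessing script, and the hub copy of Common Voice 13.0 is gated, so accepting the dataset terms and authenticating is required.

```python
# Sketch of loading the splits recorded in the metadata: config "ja",
# train+validation for training, test for evaluation.
from datasets import Audio, load_dataset

train_ds = load_dataset("mozilla-foundation/common_voice_13_0", "ja", split="train+validation")
eval_ds = load_dataset("mozilla-foundation/common_voice_13_0", "ja", split="test")

# The base model consumes 16 kHz audio, so cast the audio column accordingly.
train_ds = train_ds.cast_column("audio", Audio(sampling_rate=16_000))
eval_ds = eval_ds.cast_column("audio", Audio(sampling_rate=16_000))
```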

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 12500
  • num_epochs: 50.0
  • mixed_precision_training: Native AMP
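
For reference, a TrainingArguments sketch that mirrors the values listed above might look like the following; output_dir and anything not listed (logging and evaluation cadence, for example) are assumptions, not taken from the original run.

```python
# Hedged TrainingArguments sketch mirroring the listed hyperparameters.
# output_dir is an assumption; evaluation/logging settings are omitted
# because they are not recorded in this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Hubert-common_voice-ja-demo-japanese-debug-cosine",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 32
    num_train_epochs=50.0,
    lr_scheduler_type="cosine",
    warmup_steps=12500,
    optim="adamw_torch",             # AdamW, betas=(0.9, 0.999), eps=1e-08
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
)
```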

Training results

Training Loss Epoch Step Validation Loss Wer Cer
No log 0.2660 100 85.6894 1.0085 8.6671
No log 0.5319 200 84.9253 1.0149 8.6014
No log 0.7979 300 82.9367 1.0159 6.5921
No log 1.0638 400 75.8485 1.0 0.9860
68.3352 1.3298 500 62.8321 1.0 0.9859
68.3352 1.5957 600 57.6763 1.0 0.9859
68.3352 1.8617 700 56.3682 1.0 0.9859
68.3352 2.1277 800 55.7325 1.0 0.9859
68.3352 2.3936 900 55.0750 1.0 0.9859
48.4779 2.6596 1000 54.3879 1.0 0.9859
48.4779 2.9255 1100 53.6522 1.0 0.9859
48.4779 3.1915 1200 52.8021 1.0 0.9859
48.4779 3.4574 1300 51.9015 1.0 0.9859
48.4779 3.7234 1400 50.9048 1.0 0.9859
44.647 3.9894 1500 49.8562 1.0 0.9859
44.647 4.2553 1600 48.6802 1.0 0.9859
44.647 4.5213 1700 47.4341 1.0 0.9860
44.647 4.7872 1800 46.0720 1.0 0.9859
44.647 5.0532 1900 44.6128 1.0 0.9859
39.9761 5.3191 2000 43.0685 1.0 0.9859
39.9761 5.5851 2100 41.3959 1.0 0.9859
39.9761 5.8511 2200 39.6261 1.0 0.9859
39.9761 6.1170 2300 37.7607 1.0 0.9859
39.9761 6.3830 2400 35.7686 1.0 0.9859
33.1415 6.6489 2500 33.6543 1.0 0.9859
33.1415 6.9149 2600 31.4395 1.0 0.9859
33.1415 7.1809 2700 29.1500 1.0 0.9859
33.1415 7.4468 2800 26.7739 1.0 0.9859
33.1415 7.7128 2900 24.3323 1.0 0.9859
24.1405 7.9787 3000 21.8788 1.0 0.9861
24.1405 8.2447 3100 19.3927 1.0 0.9859
24.1405 8.5106 3200 17.0046 1.0 0.9859
24.1405 8.7766 3300 14.7217 1.0 0.9859
24.1405 9.0426 3400 12.6468 1.0 0.9860
14.1936 9.3085 3500 10.8049 1.0 0.9859
14.1936 9.5745 3600 9.2678 1.0 0.9859
14.1936 9.8404 3700 8.0368 1.0 0.9860
14.1936 10.1064 3800 7.1212 1.0 0.9860
14.1936 10.3723 3900 6.4852 1.0 0.9859
7.4544 10.6383 4000 6.0728 1.0 0.9859
7.4544 10.9043 4100 5.8352 1.0 0.9859
7.4544 11.1702 4200 5.7644 1.0 0.9859
7.4544 11.4362 4300 5.6726 1.0 0.9859
7.4544 11.7021 4400 5.6452 1.0 0.9860
5.7484 11.9681 4500 5.5683 1.0 0.9859
5.7484 12.2340 4600 5.5095 1.0 0.9859
5.7484 12.5 4700 5.4579 1.0 0.9859
5.7484 12.7660 4800 5.4164 1.0 0.9859
5.7484 13.0319 4900 5.3754 1.0 0.9859
5.466 13.2979 5000 5.3542 1.0008 0.9859
5.466 13.5638 5100 5.3131 1.1907 0.9857
5.466 13.8298 5200 5.2739 1.5616 0.9856
5.466 14.0957 5300 5.2416 1.2078 0.9860
5.466 14.3617 5400 5.2095 1.3713 0.9860
5.257 14.6277 5500 5.1721 1.5914 0.9858
5.257 14.8936 5600 5.1435 1.3063 0.9857
5.257 15.1596 5700 5.1101 1.5610 0.9857
5.257 15.4255 5800 5.0661 1.7831 0.9861
5.257 15.6915 5900 5.0305 1.7904 0.9862
5.0645 15.9574 6000 4.9771 1.6460 0.9852
5.0645 16.2234 6100 4.9207 1.4819 0.9849
5.0645 16.4894 6200 4.8480 1.3870 0.9846
5.0645 16.7553 6300 4.7647 1.5983 0.9794
5.0645 17.0213 6400 4.6895 1.6639 0.9769
4.7633 17.2872 6500 4.6069 1.6518 0.9685
4.7633 17.5532 6600 4.4924 1.7159 0.9643
4.7633 17.8191 6700 4.3929 1.7851 0.9537
4.7633 18.0851 6800 4.2750 1.8838 0.9071
4.7633 18.3511 6900 4.1479 1.9080 0.8752
4.298 18.6170 7000 4.0495 1.9148 0.8506
4.298 18.8830 7100 3.9533 1.9289 0.8382
4.298 19.1489 7200 3.8649 1.9211 0.8414
4.298 19.4149 7300 3.7631 1.9229 0.8150
4.298 19.6809 7400 3.6881 1.9156 0.8070
3.8085 19.9468 7500 3.6324 1.9325 0.8051
3.8085 20.2128 7600 3.5423 1.9184 0.7772
3.8085 20.4787 7700 3.4706 1.9229 0.7731
3.8085 20.7447 7800 3.4111 1.9084 0.7712
3.8085 21.0106 7900 3.3576 1.9211 0.7663
3.4397 21.2766 8000 3.2964 1.9219 0.7642
3.4397 21.5426 8100 3.2338 1.9184 0.7542
3.4397 21.8085 8200 3.2086 1.9388 0.7627
3.4397 22.0745 8300 3.1352 1.9172 0.7449
3.4397 22.3404 8400 3.1061 1.9388 0.7425
3.1352 22.6064 8500 3.0486 1.9084 0.7352
3.1352 22.8723 8600 2.9802 1.9221 0.7325
3.1352 23.1383 8700 2.9792 1.9652 0.7435
3.1352 23.4043 8800 2.9154 1.9333 0.7334
3.1352 23.6702 8900 2.8665 1.9783 0.7203
2.8479 23.9362 9000 2.8167 1.9569 0.7277
2.8479 24.2021 9100 2.7657 1.9364 0.7062
2.8479 24.4681 9200 2.7301 1.9807 0.7037
2.8479 24.7340 9300 2.6957 1.9501 0.6956
2.8479 25.0 9400 2.6688 1.9851 0.6947
2.5799 25.2660 9500 2.6196 1.9869 0.6925
2.5799 25.5319 9600 2.5810 1.9917 0.6914
2.5799 25.7979 9700 2.5493 1.9895 0.6860
2.5799 26.0638 9800 2.5001 1.9837 0.6787
2.5799 26.3298 9900 2.4738 1.9750 0.6767
2.3389 26.5957 10000 2.4752 1.9829 0.6685
2.3389 26.8617 10100 2.4339 1.9698 0.6659
2.3389 27.1277 10200 2.3766 1.9909 0.6602
2.3389 27.3936 10300 2.3440 1.9905 0.6588
2.3389 27.6596 10400 2.3124 1.9911 0.6591
2.1275 27.9255 10500 2.2800 1.9917 0.6543
2.1275 28.1915 10600 2.2481 1.9497 0.6418
2.1275 28.4574 10700 2.2140 1.9857 0.6396
2.1275 28.7234 10800 2.1964 1.9889 0.6345
2.1275 28.9894 10900 2.1698 1.9905 0.6459
1.9256 29.2553 11000 2.1353 1.9925 0.6352
1.9256 29.5213 11100 2.1014 1.9905 0.6392
1.9256 29.7872 11200 2.0988 1.9917 0.6413
1.9256 30.0532 11300 2.0646 1.9917 0.6259
1.9256 30.3191 11400 2.0163 1.9899 0.6277
1.7455 30.5851 11500 1.9995 1.9895 0.6217
1.7455 30.8511 11600 1.9474 1.9887 0.6199
1.7455 31.1170 11700 1.9520 1.9865 0.6122
1.7455 31.3830 11800 1.9263 1.9883 0.6086
1.7455 31.6489 11900 1.9068 1.9881 0.6072
1.5853 31.9149 12000 1.8782 1.9875 0.6112
1.5853 32.1809 12100 1.8634 1.9869 0.6092
1.5853 32.4468 12200 1.8379 1.9847 0.5918
1.5853 32.7128 12300 1.8158 1.9853 0.6022
1.5853 32.9787 12400 1.7806 1.9861 0.5930
1.4359 33.2447 12500 1.7551 1.9823 0.5925
1.4359 33.5106 12600 1.7456 1.9827 0.5923
1.4359 33.7766 12700 1.7167 1.9774 0.5796
1.4359 34.0426 12800 1.7160 1.9789 0.5743
1.4359 34.3085 12900 1.7003 1.9766 0.5821
1.3053 34.5745 13000 1.6828 1.9758 0.5650
1.3053 34.8404 13100 1.6581 1.9750 0.5748
1.3053 35.1064 13200 1.6576 1.9730 0.5659
1.3053 35.3723 13300 1.6066 1.9708 0.5649
1.3053 35.6383 13400 1.6108 1.9714 0.5658
1.1806 35.9043 13500 1.6060 1.9732 0.5596
1.1806 36.1702 13600 1.5894 1.9756 0.5651
1.1806 36.4362 13700 1.5770 1.9710 0.5585
1.1806 36.7021 13800 1.5784 1.9706 0.5614
1.1806 36.9681 13900 1.5618 1.9702 0.5567
1.0758 37.2340 14000 1.5304 1.9680 0.5560
1.0758 37.5 14100 1.5293 1.9662 0.5488
1.0758 37.7660 14200 1.5076 1.9670 0.5478
1.0758 38.0319 14300 1.4922 1.9625 0.5493
1.0758 38.2979 14400 1.4985 1.9621 0.5491
0.9784 38.5638 14500 1.4935 1.9627 0.5509
0.9784 38.8298 14600 1.4927 1.9636 0.5503
0.9784 39.0957 14700 1.5040 1.9631 0.5491
0.9784 39.3617 14800 1.4626 1.9593 0.5422
0.9784 39.6277 14900 1.4509 1.9629 0.5418
0.9014 39.8936 15000 1.4410 1.9555 0.5420
0.9014 40.1596 15100 1.4515 1.9585 0.5469
0.9014 40.4255 15200 1.4180 1.9579 0.5449
0.9014 40.6915 15300 1.4222 1.9565 0.5386
0.9014 40.9574 15400 1.4090 1.9555 0.5416
0.8296 41.2234 15500 1.4069 1.9529 0.5403
0.8296 41.4894 15600 1.4077 1.9527 0.5362
0.8296 41.7553 15700 1.4173 1.9563 0.5406
0.8296 42.0213 15800 1.4264 1.9593 0.5431
0.8296 42.2872 15900 1.4124 1.9557 0.5376
0.7808 42.5532 16000 1.4008 1.9547 0.5363
0.7808 42.8191 16100 1.4079 1.9533 0.5366
0.7808 43.0851 16200 1.3887 1.9509 0.5405
0.7808 43.3511 16300 1.3935 1.9513 0.5390
0.7808 43.6170 16400 1.3829 1.9531 0.5399
0.7318 43.8830 16500 1.3820 1.9515 0.5324
0.7318 44.1489 16600 1.3826 1.9519 0.5359
0.7318 44.4149 16700 1.3815 1.9499 0.5343
0.7318 44.6809 16800 1.3834 1.9489 0.5295
0.7318 44.9468 16900 1.3662 1.9468 0.5365
0.697 45.2128 17000 1.3707 1.9487 0.5348
0.697 45.4787 17100 1.3659 1.9478 0.5358
0.697 45.7447 17200 1.3755 1.9482 0.5380
0.697 46.0106 17300 1.3710 1.9480 0.5373
0.697 46.2766 17400 1.3653 1.9462 0.5317
0.6792 46.5426 17500 1.3715 1.9478 0.5314
0.6792 46.8085 17600 1.3711 1.9468 0.5311
0.6792 47.0745 17700 1.3691 1.9484 0.5368
0.6792 47.3404 17800 1.3693 1.9480 0.5379
0.6792 47.6064 17900 1.3689 1.9456 0.5343
0.6549 47.8723 18000 1.3651 1.9456 0.5336
0.6549 48.1383 18100 1.3628 1.9474 0.5331
0.6549 48.4043 18200 1.3662 1.9466 0.5346
0.6549 48.6702 18300 1.3633 1.9464 0.5339
0.6549 48.9362 18400 1.3643 1.9452 0.5342
0.6521 49.2021 18500 1.3636 1.9470 0.5337
0.6521 49.4681 18600 1.3633 1.9472 0.5335
0.6521 49.7340 18700 1.3638 1.9468 0.5340
0.6521 50.0 18800 1.3636 1.9456 0.5335

Framework versions

  • Transformers 4.47.0.dev0
  • Pytorch 2.5.1+cu124
  • Datasets 3.1.0
  • Tokenizers 0.20.3