---
license: apache-2.0
base_model: Helsinki-NLP/opus-mt-en-ar
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: opus-mt-en-ar-finetuned-en-to-ar
    results: []
---

# opus-mt-en-ar-finetuned-en-to-ar

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-ar](https://huggingface.co/Helsinki-NLP/opus-mt-en-ar) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 2.6900
- Bleu: 16.0561
- Gen Len: 17.5458
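
Since usage isn't documented below, here is a minimal English-to-Arabic inference sketch. The checkpoint path `opus-mt-en-ar-finetuned-en-to-ar` is an assumption (a local output directory or hub id); the generation settings (`max_length`, `num_beams`) are illustrative defaults, not values from the original run:

```python
def translate(texts, model_dir="opus-mt-en-ar-finetuned-en-to-ar"):
    """Translate a list of English sentences to Arabic with the fine-tuned model."""
    # Imported lazily so the function can be defined without transformers installed.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_dir)
    batch = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
    # Beam search with a length cap; tune these for your use case.
    generated = model.generate(**batch, max_length=128, num_beams=4)
    return tokenizer.batch_decode(generated, skip_special_tokens=True)

# Example (requires the checkpoint to be available locally or on the Hub):
# translate(["How are you today?"])
```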

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
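
For reference, the hyperparameters above expressed as a plain mapping whose keys mirror the `transformers.Seq2SeqTrainingArguments` field names (a sketch of the likely configuration, not the original training script):

```python
# Hyperparameters from the run above; keys follow Seq2SeqTrainingArguments naming.
training_args = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 16,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 30,
    # Adam settings as reported: betas=(0.9, 0.999), epsilon=1e-08
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
}
```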

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Bleu    | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|
| 1.7613        | 1.0   | 2381  | 2.3308          | 15.763  | 21.341  |
| 1.6561        | 2.0   | 4762  | 2.3460          | 15.8584 | 18.3615 |
| 1.5261        | 3.0   | 7143  | 2.3583          | 15.9146 | 18.3606 |
| 1.4501        | 4.0   | 9524  | 2.3798          | 16.1441 | 17.475  |
| 1.3497        | 5.0   | 11905 | 2.4023          | 16.2252 | 17.4995 |
| 1.4105        | 6.0   | 14286 | 2.3787          | 16.0959 | 17.9816 |
| 1.3328        | 7.0   | 16667 | 2.3917          | 16.1331 | 17.6262 |
| 1.2937        | 8.0   | 19048 | 2.4080          | 16.3078 | 17.4534 |
| 1.2121        | 9.0   | 21429 | 2.4251          | 16.2201 | 17.6992 |
| 1.1539        | 10.0  | 23810 | 2.4435          | 16.3003 | 17.5033 |
| 1.1082        | 11.0  | 26191 | 2.4623          | 16.1261 | 17.5324 |
| 1.0687        | 12.0  | 28572 | 2.4776          | 16.2769 | 17.4889 |
| 1.0148        | 13.0  | 30953 | 2.5003          | 16.2253 | 17.482  |
| 0.9665        | 14.0  | 33334 | 2.5121          | 15.9708 | 17.7131 |
| 0.9399        | 15.0  | 35715 | 2.5320          | 16.1663 | 17.552  |
| 0.9002        | 16.0  | 38096 | 2.5455          | 16.0954 | 17.4487 |
| 0.8696        | 17.0  | 40477 | 2.5698          | 16.1255 | 17.4858 |
| 0.8382        | 18.0  | 42858 | 2.5831          | 16.0722 | 17.3738 |
| 0.8133        | 19.0  | 45239 | 2.6000          | 15.9531 | 17.6652 |
| 0.7738        | 20.0  | 47620 | 2.6132          | 16.077  | 17.5936 |
| 0.7624        | 21.0  | 50001 | 2.6288          | 16.0965 | 17.5484 |
| 0.7395        | 22.0  | 52382 | 2.6402          | 16.044  | 17.5031 |
| 0.7106        | 23.0  | 54763 | 2.6560          | 15.8598 | 17.5839 |
| 0.7001        | 24.0  | 57144 | 2.6605          | 15.9754 | 17.5711 |
| 0.6812        | 25.0  | 59525 | 2.6685          | 15.9742 | 17.526  |
| 0.6665        | 26.0  | 61906 | 2.6781          | 16.0116 | 17.4887 |
| 0.6594        | 27.0  | 64287 | 2.6834          | 15.9705 | 17.5014 |
| 0.6513        | 28.0  | 66668 | 2.6870          | 15.9401 | 17.5768 |
| 0.6329        | 29.0  | 69049 | 2.6893          | 16.0272 | 17.5373 |
| 0.6266        | 30.0  | 71430 | 2.6900          | 16.0561 | 17.5458 |

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1
- Datasets 2.14.5
- Tokenizers 0.14.1