---
base_model: Helsinki-NLP/opus-mt-en-ar
license: apache-2.0
metrics:
  - bleu
tags:
  - generated_from_trainer
model-index:
  - name: Tounsify-v0.9-shuffle
    results: []
---

# Tounsify-v0.9-shuffle

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-ar](https://huggingface.co/Helsinki-NLP/opus-mt-en-ar) on an unspecified dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the results):

- Loss: 1.2171
- Bleu: 47.287
- Gen Len: 9.1774
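
The card ships without a usage example, so here is a minimal inference sketch. It assumes the checkpoint is published on the Hugging Face Hub as `cherifkhalifah/Tounsify-v0.9-shuffle`; that repo id is an assumption, so substitute a local path if you have the weights elsewhere.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed Hub repo id -- replace with your own path or repo if the
# checkpoint lives elsewhere.
model_id = "cherifkhalifah/Tounsify-v0.9-shuffle"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate an English sentence; the base model is opus-mt-en-ar.
inputs = tokenizer("How are you today?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```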

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding training arguments follows the list):

- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
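
These values map onto `Seq2SeqTrainingArguments` roughly as below. This is a sketch, not the exact training script: `output_dir` and `evaluation_strategy` are assumptions, and the Adam betas/epsilon listed above are the optimizer defaults, so they need no explicit arguments.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="Tounsify-v0.9-shuffle",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                    # Native AMP mixed precision
    evaluation_strategy="epoch",  # assumed: the log below reports per-epoch eval
    predict_with_generate=True,   # needed to compute BLEU / Gen Len during eval
)
```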

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| No log        | 1.0   | 62   | 2.2319          | 11.5922 | 8.3226  |
| No log        | 2.0   | 124  | 1.4979          | 22.8539 | 8.3871  |
| No log        | 3.0   | 186  | 1.1749          | 31.2278 | 8.5323  |
| No log        | 4.0   | 248  | 1.0500          | 39.4966 | 8.7097  |
| No log        | 5.0   | 310  | 0.9562          | 42.3858 | 8.7742  |
| No log        | 6.0   | 372  | 0.9306          | 43.1436 | 8.6935  |
| No log        | 7.0   | 434  | 0.8928          | 42.3849 | 8.8387  |
| No log        | 8.0   | 496  | 0.9243          | 42.8107 | 8.8548  |
| 0.9876        | 9.0   | 558  | 0.9293          | 44.3329 | 8.8548  |
| 0.9876        | 10.0  | 620  | 0.9398          | 42.859  | 8.871   |
| 0.9876        | 11.0  | 682  | 0.9637          | 44.6861 | 8.8548  |
| 0.9876        | 12.0  | 744  | 0.9514          | 45.1661 | 8.8387  |
| 0.9876        | 13.0  | 806  | 0.9780          | 45.5317 | 8.8226  |
| 0.9876        | 14.0  | 868  | 0.9832          | 48.3237 | 8.8548  |
| 0.9876        | 15.0  | 930  | 0.9618          | 49.9886 | 9.0484  |
| 0.9876        | 16.0  | 992  | 0.9980          | 47.1846 | 8.9516  |
| 0.0522        | 17.0  | 1054 | 0.9758          | 45.6558 | 8.9839  |
| 0.0522        | 18.0  | 1116 | 0.9907          | 45.325  | 9.0     |
| 0.0522        | 19.0  | 1178 | 1.0234          | 48.1955 | 8.9194  |
| 0.0522        | 20.0  | 1240 | 1.0339          | 47.0583 | 8.9839  |
| 0.0522        | 21.0  | 1302 | 1.0129          | 49.2604 | 8.8871  |
| 0.0522        | 22.0  | 1364 | 1.0407          | 49.847  | 8.8871  |
| 0.0522        | 23.0  | 1426 | 1.0656          | 48.4962 | 8.9839  |
| 0.0522        | 24.0  | 1488 | 1.0504          | 48.3458 | 8.9839  |
| 0.0153        | 25.0  | 1550 | 1.0556          | 49.455  | 9.0161  |
| 0.0153        | 26.0  | 1612 | 1.0522          | 48.9644 | 9.0323  |
| 0.0153        | 27.0  | 1674 | 1.0793          | 48.7056 | 8.9839  |
| 0.0153        | 28.0  | 1736 | 1.0859          | 48.8805 | 8.9839  |
| 0.0153        | 29.0  | 1798 | 1.1362          | 48.306  | 9.0806  |
| 0.0153        | 30.0  | 1860 | 1.0573          | 51.8905 | 9.2097  |
| 0.0153        | 31.0  | 1922 | 1.1220          | 48.3591 | 9.0806  |
| 0.0153        | 32.0  | 1984 | 1.0879          | 49.0288 | 9.129   |
| 0.0097        | 33.0  | 2046 | 1.1219          | 50.593  | 9.1129  |
| 0.0097        | 34.0  | 2108 | 1.1439          | 49.1391 | 9.0     |
| 0.0097        | 35.0  | 2170 | 1.1265          | 50.5195 | 9.0323  |
| 0.0097        | 36.0  | 2232 | 1.1031          | 50.2673 | 9.0806  |
| 0.0097        | 37.0  | 2294 | 1.1418          | 51.3256 | 8.9839  |
| 0.0097        | 38.0  | 2356 | 1.1419          | 50.8617 | 9.0968  |
| 0.0097        | 39.0  | 2418 | 1.1166          | 51.2853 | 9.1452  |
| 0.0097        | 40.0  | 2480 | 1.1309          | 50.6103 | 9.0806  |
| 0.0082        | 41.0  | 2542 | 1.1501          | 50.7017 | 9.0     |
| 0.0082        | 42.0  | 2604 | 1.1108          | 51.6167 | 9.0806  |
| 0.0082        | 43.0  | 2666 | 1.1176          | 51.1365 | 9.0968  |
| 0.0082        | 44.0  | 2728 | 1.1544          | 49.703  | 9.0645  |
| 0.0082        | 45.0  | 2790 | 1.1655          | 51.432  | 9.1935  |
| 0.0082        | 46.0  | 2852 | 1.1460          | 50.1011 | 9.1774  |
| 0.0082        | 47.0  | 2914 | 1.1377          | 50.0643 | 9.129   |
| 0.0082        | 48.0  | 2976 | 1.1406          | 50.1912 | 9.1129  |
| 0.0081        | 49.0  | 3038 | 1.1452          | 47.2465 | 9.1774  |
| 0.0081        | 50.0  | 3100 | 1.1532          | 49.9986 | 9.0806  |
| 0.0081        | 51.0  | 3162 | 1.1596          | 47.8461 | 9.0806  |
| 0.0081        | 52.0  | 3224 | 1.1643          | 48.3596 | 9.0968  |
| 0.0081        | 53.0  | 3286 | 1.1577          | 47.1237 | 9.0806  |
| 0.0081        | 54.0  | 3348 | 1.1599          | 48.6692 | 9.0968  |
| 0.0081        | 55.0  | 3410 | 1.1613          | 48.1806 | 9.0806  |
| 0.0081        | 56.0  | 3472 | 1.1668          | 47.5471 | 9.1613  |
| 0.0069        | 57.0  | 3534 | 1.1749          | 50.0805 | 9.0806  |
| 0.0069        | 58.0  | 3596 | 1.1784          | 49.3841 | 9.1774  |
| 0.0069        | 59.0  | 3658 | 1.1666          | 49.4183 | 9.0645  |
| 0.0069        | 60.0  | 3720 | 1.1768          | 47.8488 | 9.1774  |
| 0.0069        | 61.0  | 3782 | 1.1908          | 48.7428 | 9.0968  |
| 0.0069        | 62.0  | 3844 | 1.1882          | 49.2957 | 8.9677  |
| 0.0069        | 63.0  | 3906 | 1.1869          | 49.5255 | 9.0323  |
| 0.0069        | 64.0  | 3968 | 1.1866          | 48.8917 | 9.0161  |
| 0.0068        | 65.0  | 4030 | 1.1858          | 48.5308 | 9.0968  |
| 0.0068        | 66.0  | 4092 | 1.1951          | 49.2041 | 9.0806  |
| 0.0068        | 67.0  | 4154 | 1.1828          | 49.1255 | 9.0806  |
| 0.0068        | 68.0  | 4216 | 1.1923          | 48.0252 | 9.0484  |
| 0.0068        | 69.0  | 4278 | 1.1947          | 48.0764 | 9.1129  |
| 0.0068        | 70.0  | 4340 | 1.1927          | 48.2729 | 9.0484  |
| 0.0068        | 71.0  | 4402 | 1.1907          | 47.9908 | 9.129   |
| 0.0068        | 72.0  | 4464 | 1.1920          | 48.8939 | 9.0968  |
| 0.0062        | 73.0  | 4526 | 1.1939          | 49.0374 | 9.0968  |
| 0.0062        | 74.0  | 4588 | 1.1952          | 49.0374 | 9.0968  |
| 0.0062        | 75.0  | 4650 | 1.1954          | 49.2333 | 9.0323  |
| 0.0062        | 76.0  | 4712 | 1.1951          | 48.3221 | 9.1129  |
| 0.0062        | 77.0  | 4774 | 1.1971          | 48.3221 | 9.1129  |
| 0.0062        | 78.0  | 4836 | 1.1978          | 49.5615 | 9.1129  |
| 0.0062        | 79.0  | 4898 | 1.1994          | 48.947  | 9.0484  |
| 0.0062        | 80.0  | 4960 | 1.2009          | 48.0436 | 9.0806  |
| 0.0045        | 81.0  | 5022 | 1.2021          | 47.9908 | 9.129   |
| 0.0045        | 82.0  | 5084 | 1.2048          | 47.9908 | 9.129   |
| 0.0045        | 83.0  | 5146 | 1.2045          | 49.5615 | 9.0968  |
| 0.0045        | 84.0  | 5208 | 1.2065          | 49.4183 | 9.0968  |
| 0.0045        | 85.0  | 5270 | 1.2081          | 48.9864 | 9.0968  |
| 0.0045        | 86.0  | 5332 | 1.2131          | 46.327  | 9.0968  |
| 0.0045        | 87.0  | 5394 | 1.2144          | 47.2291 | 9.1452  |
| 0.0045        | 88.0  | 5456 | 1.2135          | 47.2291 | 9.1452  |
| 0.0047        | 89.0  | 5518 | 1.2163          | 46.8533 | 9.1452  |
| 0.0047        | 90.0  | 5580 | 1.2207          | 47.3713 | 9.1452  |
| 0.0047        | 91.0  | 5642 | 1.2188          | 47.3713 | 9.1452  |
| 0.0047        | 92.0  | 5704 | 1.2193          | 47.3713 | 9.1452  |
| 0.0047        | 93.0  | 5766 | 1.2188          | 48.9917 | 9.1452  |
| 0.0047        | 94.0  | 5828 | 1.2175          | 47.2291 | 9.1452  |
| 0.0047        | 95.0  | 5890 | 1.2177          | 48.9917 | 9.1452  |
| 0.0047        | 96.0  | 5952 | 1.2177          | 47.3713 | 9.1452  |
| 0.0043        | 97.0  | 6014 | 1.2165          | 47.3713 | 9.1452  |
| 0.0043        | 98.0  | 6076 | 1.2167          | 47.287  | 9.1774  |
| 0.0043        | 99.0  | 6138 | 1.2169          | 47.287  | 9.1774  |
| 0.0043        | 100.0 | 6200 | 1.2171          | 47.287  | 9.1774  |
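
The Bleu and Gen Len columns suggest evaluation ran with `predict_with_generate=True` and a sacrebleu-based `compute_metrics`, in the style of the standard translation example. Below is a sketch of such a function; the exact function used for this run is not part of the card, and the tokenizer repo id is the same assumption as above.

```python
import numpy as np
import evaluate
from transformers import AutoTokenizer

# Assumed repo id, as in the inference sketch above.
tokenizer = AutoTokenizer.from_pretrained("cherifkhalifah/Tounsify-v0.9-shuffle")
sacrebleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # Labels use -100 for padding; restore the pad token before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = sacrebleu.compute(
        predictions=decoded_preds,
        references=[[label] for label in decoded_labels],
    )
    # "Gen Len": mean generated length in tokens (non-pad positions).
    gen_len = np.mean(
        [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in preds]
    )
    return {"bleu": result["score"], "gen_len": float(gen_len)}
```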

## Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1