# nllb-200-1.3B-ICFOSS_Tamil_Malayalam_Translator

This model is a fine-tuned version of facebook/nllb-200-1.3B on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.7754
- BLEU: 29.8537
- ROUGE: rouge1 = 0.2008, rouge2 = 0.0979, rougeL = 0.1974, rougeLsum = 0.1998
- chrF: 64.95 (char_order = 6, word_order = 0, beta = 2)
## Model description
More information needed
## Intended uses & limitations
More information needed
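The card does not include a usage example. Below is a minimal inference sketch, assuming this repository hosts a PEFT/LoRA adapter on top of facebook/nllb-200-1.3B (the framework versions list PEFT 0.10.0) and using the standard NLLB FLORES-200 language codes for Tamil (`tam_Taml`) and Malayalam (`mal_Mlym`). Generation settings are illustrative, not taken from the card.

```python
BASE_MODEL = "facebook/nllb-200-1.3B"
ADAPTER = "ArunIcfoss/nllb-200-1.3B-ICFOSS_Tamil_Malayalam_Translator"
SRC_LANG = "tam_Taml"  # Tamil (FLORES-200 code used by NLLB)
TGT_LANG = "mal_Mlym"  # Malayalam


def load_translator():
    """Load the base NLLB model and apply the fine-tuned adapter."""
    # Heavy dependencies are imported lazily so the module stays
    # importable without transformers/peft installed.
    from peft import PeftModel
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL, src_lang=SRC_LANG)
    model = AutoModelForSeq2SeqLM.from_pretrained(BASE_MODEL)
    model = PeftModel.from_pretrained(model, ADAPTER)
    model.eval()
    return tokenizer, model


def translate(text, tokenizer, model, max_new_tokens=128):
    """Translate one Tamil sentence to Malayalam."""
    inputs = tokenizer(text, return_tensors="pt")
    generated = model.generate(
        **inputs,
        # Force the decoder to start in the target language.
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(TGT_LANG),
        max_new_tokens=max_new_tokens,
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
```

Typical use would be `tokenizer, model = load_translator()` followed by `translate("…", tokenizer, model)`.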
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 5
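The hyperparameters above correspond roughly to the following Transformers `Seq2SeqTrainingArguments` configuration (a sketch; the output directory is a hypothetical name, and Adam with betas = (0.9, 0.999) and epsilon = 1e-08 is the Trainer default rather than an explicit argument here):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="nllb-200-1.3B-tamil-malayalam",  # hypothetical path
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=5,
)
```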
### Training results
| Training Loss | Epoch | Step | Validation Loss | BLEU | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | chrF |
|---|---|---|---|---|---|---|---|---|---|
| 0.9004 | 1.0 | 3806 | 0.8154 | 27.1742 | 0.1977 | 0.0958 | 0.1946 | 0.1968 | 63.35 |
| 0.7946 | 2.0 | 7612 | 0.7844 | 28.0891 | 0.2006 | 0.0958 | 0.1974 | 0.2001 | 64.30 |
| 0.7549 | 3.0 | 11418 | 0.7754 | 29.3679 | 0.2006 | 0.0979 | 0.1974 | 0.1996 | 64.84 |
| 0.7403 | 4.0 | 15224 | 0.7754 | 29.1387 | 0.2006 | 0.0979 | 0.1974 | 0.1996 | 64.64 |
| 0.7374 | 5.0 | 19030 | 0.7754 | 29.8537 | 0.2008 | 0.0979 | 0.1974 | 0.1998 | 64.95 |

chrF was computed with char_order = 6, word_order = 0, beta = 2 for all epochs.
### Framework versions
- PEFT 0.10.0
- Transformers 4.40.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1