# xlm-roberta-large_ALL_BCE_new_data_multihead_19_shuffled_special_tokens_final
This model is a fine-tuned version of FacebookAI/xlm-roberta-large; the training dataset is not specified in this card. It achieves the following results on the evaluation set. In the list below, "F1 Macro t" is the macro-averaged F1 when all 19 heads share the global decision threshold t, "Threshold i" is the tuned per-head threshold for head i, and "F1 (head i)" is that head's F1 at its tuned threshold (a scoring sketch follows this list):
- Loss: 1.4169
- F1 Macro 0.1: 0.0608
- F1 Macro 0.15: 0.0628
- F1 Macro 0.2: 0.0639
- F1 Macro 0.25: 0.0677
- F1 Macro 0.3: 0.0681
- F1 Macro 0.35: 0.0723
- F1 Macro 0.4: 0.0790
- F1 Macro 0.45: 0.0818
- F1 Macro 0.5: 0.0874
- F1 Macro 0.55: 0.0904
- F1 Macro 0.6: 0.0771
- F1 Macro 0.65: 0.0611
- F1 Macro 0.7: 0.0421
- F1 Macro 0.75: 0.0235
- F1 Macro 0.8: 0.0
- F1 Macro 0.85: 0.0
- F1 Macro 0.9: 0.0
- F1 Macro 0.95: 0.0
- Threshold 0: 0.45
- Threshold 1: 0.5
- Threshold 2: 0.7
- Threshold 3: 0.35
- Threshold 4: 0.6
- Threshold 5: 0.65
- Threshold 6: 0.55
- Threshold 7: 0.45
- Threshold 8: 0.55
- Threshold 9: 0.5
- Threshold 10: 0.5
- Threshold 11: 0.6
- Threshold 12: 0.4
- Threshold 13: 0.1
- Threshold 14: 0.45
- Threshold 15: 0.55
- Threshold 16: 0.55
- Threshold 17: 0.5
- Threshold 18: 0.35
- 0: 0.0476
- 1: 0.0951
- 2: 0.1069
- 3: 0.0416
- 4: 0.1579
- 5: 0.1767
- 6: 0.1486
- 7: 0.0558
- 8: 0.0742
- 9: 0.2208
- 10: 0.0532
- 11: 0.1499
- 12: 0.0799
- 13: 0.0095
- 14: 0.0968
- 15: 0.0679
- 16: 0.1100
- 17: 0.0621
- 18: 0.0296
- Max F1 (best single global threshold, reached at 0.55): 0.0904
- Mean F1 (average of the per-head F1 scores above): 0.0939
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` reconstruction follows the list):
- learning_rate: 5e-06
- train_batch_size: 8
- eval_batch_size: 8
- seed: 2024
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
Training Loss | Epoch | Step | Validation Loss | F1 Macro 0.1 | F1 Macro 0.15 | F1 Macro 0.2 | F1 Macro 0.25 | F1 Macro 0.3 | F1 Macro 0.35 | F1 Macro 0.4 | F1 Macro 0.45 | F1 Macro 0.5 | F1 Macro 0.55 | F1 Macro 0.6 | F1 Macro 0.65 | F1 Macro 0.7 | F1 Macro 0.75 | F1 Macro 0.8 | F1 Macro 0.85 | F1 Macro 0.9 | F1 Macro 0.95 | Threshold 0 | Threshold 1 | Threshold 2 | Threshold 3 | Threshold 4 | Threshold 5 | Threshold 6 | Threshold 7 | Threshold 8 | Threshold 9 | Threshold 10 | Threshold 11 | Threshold 12 | Threshold 13 | Threshold 14 | Threshold 15 | Threshold 16 | Threshold 17 | Threshold 18 | F1 0 | F1 1 | F1 2 | F1 3 | F1 4 | F1 5 | F1 6 | F1 7 | F1 8 | F1 9 | F1 10 | F1 11 | F1 12 | F1 13 | F1 14 | F1 15 | F1 16 | F1 17 | F1 18 | Max F1 | Mean F1 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.4193 | 1.0 | 7458 | 1.4534 | 0.0606 | 0.0634 | 0.0648 | 0.0667 | 0.0689 | 0.0718 | 0.0765 | 0.0771 | 0.0811 | 0.0839 | 0.0856 | 0.0772 | 0.0562 | 0.0434 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4 | 0.55 | 0.6 | 0.3 | 0.55 | 0.7 | 0.55 | 0.55 | 0.6 | 0.5 | 0.4 | 0.6 | 0.6 | 0.1 | 0.5 | 0.55 | 0.55 | 0.5 | 0.35 | 0.0376 | 0.0951 | 0.1069 | 0.0416 | 0.1579 | 0.1767 | 0.1486 | 0.0558 | 0.0742 | 0.2208 | 0.0532 | 0.1499 | 0.0799 | 0.0095 | 0.0968 | 0.0655 | 0.1053 | 0.0621 | 0.0296 | 0.0856 | 0.0930 |
1.4423 | 2.0 | 14916 | 1.4217 | 0.0608 | 0.0627 | 0.0641 | 0.0673 | 0.0693 | 0.0732 | 0.0783 | 0.0835 | 0.0829 | 0.0861 | 0.0793 | 0.0621 | 0.0363 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6 | 0.45 | 0.65 | 0.35 | 0.5 | 0.65 | 0.55 | 0.55 | 0.6 | 0.55 | 0.25 | 0.55 | 0.45 | 0.15 | 0.45 | 0.55 | 0.5 | 0.55 | 0.35 | 0.0476 | 0.0951 | 0.1069 | 0.0416 | 0.1569 | 0.1767 | 0.1486 | 0.0538 | 0.0742 | 0.2204 | 0.0520 | 0.1455 | 0.0799 | 0.0095 | 0.0968 | 0.0679 | 0.1100 | 0.0621 | 0.0270 | 0.0861 | 0.0933 |
1.4186 | 3.0 | 22374 | 1.4169 | 0.0608 | 0.0628 | 0.0639 | 0.0677 | 0.0681 | 0.0723 | 0.0790 | 0.0818 | 0.0874 | 0.0904 | 0.0771 | 0.0611 | 0.0421 | 0.0235 | 0.0 | 0.0 | 0.0 | 0.0 | 0.45 | 0.5 | 0.7 | 0.35 | 0.6 | 0.65 | 0.55 | 0.45 | 0.55 | 0.5 | 0.5 | 0.6 | 0.4 | 0.1 | 0.45 | 0.55 | 0.55 | 0.5 | 0.35 | 0.0476 | 0.0951 | 0.1069 | 0.0416 | 0.1579 | 0.1767 | 0.1486 | 0.0558 | 0.0742 | 0.2208 | 0.0532 | 0.1499 | 0.0799 | 0.0095 | 0.0968 | 0.0679 | 0.1100 | 0.0621 | 0.0296 | 0.0904 | 0.0939 |
### Framework versions
- Transformers 4.36.1
- Pytorch 2.1.0+cu121
- Datasets 2.13.1
- Tokenizers 0.15.0