---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-rvl-cdip
  results: []
---
# distilbert-base-uncased-rvl-cdip
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset (the model name suggests the RVL-CDIP document classification benchmark). It achieves the following results on the evaluation set:
- Loss: 1.2055
- Accuracy: 0.7079
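
The checkpoint can be loaded with the standard `text-classification` pipeline. Below is a minimal inference sketch; the model id and the printed label are placeholders, since this card does not document the actual Hub repository or label names.

```python
# Minimal inference sketch. The model id and example output are placeholders;
# point `model` at the actual checkpoint directory or Hub repository.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-rvl-cdip",  # hypothetical path / Hub id
)

# The model consumes document text (e.g. OCR output) and returns one of the
# class labels defined in the checkpoint's config.
print(classifier("Invoice No. 4821, total amount due: $1,250.00"))
# e.g. [{'label': 'LABEL_7', 'score': 0.91}]  (illustrative output only)
```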
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
- mixed_precision_training: Native AMP
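
As a rough guide, the settings above map onto `transformers.TrainingArguments` as in the sketch below; dataset preparation and the `Trainer` call are omitted, and `output_dir` is a placeholder.

```python
# Sketch of the reported hyperparameters as TrainingArguments (Transformers 4.36).
# Dataset loading, tokenization, and the Trainer itself are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-rvl-cdip",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=15,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="epoch",     # assumption: per-epoch eval, matching the results table
    logging_strategy="epoch",
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)
```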
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 48   | 2.4392          | 0.45     |
| No log        | 2.0   | 96   | 1.5529          | 0.5632   |
| No log        | 3.0   | 144  | 1.3164          | 0.6132   |
| No log        | 4.0   | 192  | 1.1269          | 0.6395   |
| No log        | 5.0   | 240  | 1.0145          | 0.7      |
| No log        | 6.0   | 288  | 1.0839          | 0.6816   |
| No log        | 7.0   | 336  | 1.1414          | 0.6868   |
| No log        | 8.0   | 384  | 1.1220          | 0.7053   |
| No log        | 9.0   | 432  | 1.1402          | 0.7105   |
| No log        | 10.0  | 480  | 1.1805          | 0.7132   |
| 0.8154        | 11.0  | 528  | 1.1923          | 0.7132   |
| 0.8154        | 12.0  | 576  | 1.2007          | 0.7079   |
| 0.8154        | 13.0  | 624  | 1.1973          | 0.7079   |
| 0.8154        | 14.0  | 672  | 1.2049          | 0.7105   |
| 0.8154        | 15.0  | 720  | 1.2055          | 0.7079   |
### Framework versions
- Transformers 4.36.0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
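
A quick way to check a local environment against these reported versions (assuming all four packages are installed):

```python
# Print installed versions to compare against the ones reported above.
import datasets
import tokenizers
import torch
import transformers

print("transformers:", transformers.__version__)  # reported: 4.36.0
print("torch:       ", torch.__version__)         # reported: 2.1.0+cu118
print("datasets:    ", datasets.__version__)      # reported: 2.15.0
print("tokenizers:  ", tokenizers.__version__)    # reported: 0.15.0
```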