---
datasets:
- GroNLP/dutch-cola
language:
- nl
tags:
- generated_from_trainer
metrics:
- accuracy
base_model: yhavinga/gpt2-medium-dutch
---
# gpt2-medium-dutch fine-tuned on Dutch CoLA
This model is a fine-tuned version of [yhavinga/gpt2-medium-dutch](https://huggingface.co/yhavinga/gpt2-medium-dutch) on the [GroNLP/dutch-cola](https://huggingface.co/datasets/GroNLP/dutch-cola) dataset for Dutch linguistic acceptability classification (a minimal usage sketch follows the results below).
It achieves the following results on the evaluation set:
- Loss: 0.5190
- Accuracy: 0.7613
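
A minimal inference sketch, assuming the fine-tuned checkpoint is published as a sequence classifier; the repo id `your-username/gpt2-medium-dutch-cola` below is a placeholder, not the actual model id, and the label mapping (0 = unacceptable, 1 = acceptable) is assumed:

```python
# Minimal inference sketch; the repo id is a placeholder, not the actual model id.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "your-username/gpt2-medium-dutch-cola"  # hypothetical repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

sentence = "De kat slaapt op de bank."  # example Dutch sentence
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assumed label mapping: 0 = unacceptable, 1 = acceptable
print(logits.softmax(dim=-1))
```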
## Training Details
### Training Hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW with lr=4e-05, weight_decay=0.01, betas=(0.9, 0.999) and epsilon=1e-08
- num_epochs: 3
- fp16: True
- gradient_accumulation_steps: 1
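
A hedged sketch of how these hyperparameters map onto `transformers.TrainingArguments`; the output directory is a placeholder and the per-epoch evaluation strategy is assumed from the results table below, since the actual training script is not included in this card:

```python
# Sketch of TrainingArguments matching the hyperparameters above; paths are placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gpt2-medium-dutch-cola",  # placeholder output directory
    learning_rate=4e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    weight_decay=0.01,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    num_train_epochs=3,
    fp16=True,
    gradient_accumulation_steps=1,
    evaluation_strategy="epoch",  # assumed; per-epoch validation matches the results table
)
```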
## Evaluation
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5648 | 1.0 | 1244 | 0.5695 | 0.7192 |
| 0.3399 | 2.0 | 2488 | 0.5190 | 0.7613 |
| 0.1779 | 3.0 | 3732 | 0.7269 | 0.7625 |
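
The accuracy values above are standard sequence-classification accuracy. A minimal `compute_metrics` sketch in the usual `Trainer` style, using the `evaluate` library; the exact metric code used for this run is not shown in the card:

```python
# Sketch of an accuracy metric function for Trainer, based on the `evaluate` library.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```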