---
datasets:
- GroNLP/dutch-cola
language:
- nl
metrics:
- accuracy
pipeline_tag: text-classification
---

# Model Card for gpt2-medium-dutch fine-tuned on Dutch CoLA

This model is a fine-tuned version of [yhavinga/gpt2-medium-dutch](https://huggingface.co/yhavinga/gpt2-medium-dutch) on the [GroNLP/dutch-cola](https://huggingface.co/datasets/GroNLP/dutch-cola) dataset. It achieves the following results on the evaluation set:

- Loss: 0.519
- Accuracy: 0.7613

### Model Description

The base model is a Dutch GPT-2 medium checkpoint, fine-tuned here as a text classifier that predicts the grammatical acceptability of Dutch sentences, following the Dutch CoLA task.

## Training Details

### Training Data

The model was fine-tuned on [GroNLP/dutch-cola](https://huggingface.co/datasets/GroNLP/dutch-cola), the Dutch Corpus of Linguistic Acceptability.

### Training Procedure

#### Preprocessing

[More Information Needed]

#### Training Hyperparameters

The following hyperparameters were used during training:

- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW with lr=4e-05, weight_decay=0.01, betas=(0.9, 0.999) and epsilon=1e-08
- num_epochs: 3
- fp16: True
- gradient_accumulation_steps: 1

A hedged `TrainingArguments` sketch based on these values is given at the end of this card.

## Evaluation

### Training results

[More Information Needed]
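
## Training configuration sketch

The hyperparameters listed above map directly onto the 🤗 `Trainer` API. The code below is a minimal, assumed reconstruction rather than the exact training script: the column names (`Sentence`, `Acceptability`), the split names, and the output directory are guesses about the GroNLP/dutch-cola schema and should be checked against the dataset before use.

```python
# Hedged sketch reproducing the listed hyperparameters with the 🤗 Trainer API.
# Column names, split names, and output_dir are assumptions, not taken from this card.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)

dataset = load_dataset("GroNLP/dutch-cola")

tokenizer = AutoTokenizer.from_pretrained("yhavinga/gpt2-medium-dutch")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

model = AutoModelForSequenceClassification.from_pretrained(
    "yhavinga/gpt2-medium-dutch", num_labels=2
)
model.config.pad_token_id = tokenizer.pad_token_id

def tokenize(batch):
    # "Sentence" is an assumed column name
    return tokenizer(batch["Sentence"], truncation=True)

dataset = dataset.map(tokenize, batched=True)
dataset = dataset.rename_column("Acceptability", "labels")  # assumed label column

args = TrainingArguments(
    output_dir="gpt2-medium-dutch-cola",  # placeholder name
    learning_rate=4e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    fp16=True,  # requires a CUDA GPU
    gradient_accumulation_steps=1,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],        # assumed split name
    eval_dataset=dataset["validation"],    # assumed split name
    tokenizer=tokenizer,                   # enables dynamic padding via the default collator
)
trainer.train()
```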
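
## How to use

Because the card declares `pipeline_tag: text-classification`, the checkpoint can be loaded through the standard `pipeline` API. This is an illustrative sketch only: the repository id is a placeholder for wherever this fine-tuned model is hosted, and the returned label names depend on how the model's `id2label` mapping was configured.

```python
# Hedged usage sketch; "your-username/gpt2-medium-dutch-cola" is a placeholder
# for the actual repository id of this fine-tuned checkpoint.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="your-username/gpt2-medium-dutch-cola",  # placeholder repo id
)

print(classifier("De kat zit op de mat."))
# e.g. [{'label': 'LABEL_1', 'score': 0.93}]  -- label and score are illustrative
```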