---
library_name: transformers
license: mit
base_model: microsoft/git-base
tags:
  - generated_from_trainer
model-index:
  - name: git-base-pokemon
    results: []
---

# git-base-pokemon

This model is a fine-tuned version of [microsoft/git-base](https://huggingface.co/microsoft/git-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0344
- Wer Score: 2.2807
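The Wer Score is the word error rate between generated and reference captions: word-level edit distance divided by the number of reference words (a score above 1 means the edits outnumber the reference words). A minimal, dependency-free sketch of the metric, which may differ in tokenization details from whatever library computed the scores above:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[-1][-1] / max(len(ref), 1)

print(wer("a cartoon pikachu with a red cheek", "a pikachu with red cheeks"))
```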

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
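With train_batch_size 32 and gradient_accumulation_steps 2, gradients from two micro-batches are summed before each optimizer step, which is where the effective total_train_batch_size of 64 comes from. A minimal PyTorch sketch of the pattern (the model and data here are toy stand-ins, not the GIT model from this card):

```python
import torch

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=5e-5, betas=(0.9, 0.999), eps=1e-8)

accum_steps = 2  # gradient_accumulation_steps
micro_batches = [torch.randn(32, 4) for _ in range(6)]  # train_batch_size = 32

optimizer_steps = 0
for i, batch in enumerate(micro_batches):
    loss = model(batch).pow(2).mean()
    (loss / accum_steps).backward()   # scale so the summed gradients average out
    if (i + 1) % accum_steps == 0:    # step once per 2 micro-batches -> 64 samples
        optimizer.step()
        optimizer.zero_grad()
        optimizer_steps += 1

print(optimizer_steps)  # 3 optimizer steps for 6 micro-batches
```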

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer Score |
|:-------------:|:-------:|:----:|:---------------:|:---------:|
| 7.2943        | 4.1667  | 50   | 4.4708          | 21.1278   |
| 2.2992        | 8.3333  | 100  | 0.4252          | 14.8722   |
| 0.1287        | 12.5    | 150  | 0.0311          | 0.6128    |
| 0.0161        | 16.6667 | 200  | 0.0280          | 2.4762    |
| 0.0049        | 20.8333 | 250  | 0.0304          | 2.4561    |
| 0.0022        | 25.0    | 300  | 0.0327          | 2.4085    |
| 0.0016        | 29.1667 | 350  | 0.0328          | 2.3333    |
| 0.0013        | 33.3333 | 400  | 0.0333          | 2.4298    |
| 0.0012        | 37.5    | 450  | 0.0341          | 2.3033    |
| 0.0011        | 41.6667 | 500  | 0.0344          | 2.2569    |
| 0.001         | 45.8333 | 550  | 0.0344          | 2.2744    |
| 0.001         | 50.0    | 600  | 0.0344          | 2.2807    |
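The linear scheduler decays the learning rate from 5e-05 toward zero over the run's 600 optimization steps (the final step in the table). A pure-Python sketch of that decay, assuming zero warmup steps since no warmup setting is recorded above:

```python
BASE_LR = 5e-5
TOTAL_STEPS = 600  # final optimization step in the results table

def linear_lr(step: int, base_lr: float = BASE_LR, total_steps: int = TOTAL_STEPS) -> float:
    """Linearly decay from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))    # 5e-05 at the start
print(linear_lr(300))  # 2.5e-05 halfway through
```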

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.19.1