_base_nougat_logs

This model is a fine-tuned version of facebook/nougat-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4360

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 6
  • total_train_batch_size: 48
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 30
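The relationship between these settings can be sketched in plain Python: the effective batch size is the per-device batch size times the accumulation steps, and a linear scheduler without warmup decays the learning rate to zero over the run. This is a minimal illustration, not the training script; the function name and the total step count (2490, taken from the last logged step below) are assumptions.

```python
# Hyperparameters as listed above.
learning_rate = 1e-4
train_batch_size = 8
gradient_accumulation_steps = 6

# Effective (total) train batch size: per-device batch size times
# the number of gradient-accumulation steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 48

def linear_lr(step, total_steps, base_lr=learning_rate):
    """Linear scheduler without warmup: decays from base_lr to 0."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Assuming 2490 optimizer steps over the 30 epochs:
total_steps = 2490
print(linear_lr(0, total_steps))     # 0.0001 at the start
print(linear_lr(1245, total_steps))  # 5e-05 halfway through
```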

Training results

Training Loss   Epoch     Step   Validation Loss
2.0261          0.9901      83   1.9705
1.8445          1.9920     167   1.7617
1.6923          2.9940     251   1.6539
1.5970          3.9960     335   1.5871
1.5287          4.9980     419   1.5196
1.4600          6.0000     503   1.4804
1.3647          6.9901     586   1.4360
1.2890          7.9920     670   1.3772
1.1741          8.9940     754   1.2350
0.9947          9.9960     838   1.0415
0.7889         10.9980     922   0.9238
0.6771         12.0000    1006   0.7884
0.6256         12.9901    1089   0.6646
0.5402         13.9920    1173   0.6095
0.5252         14.9940    1257   0.5702
0.4410         15.9960    1341   0.5282
0.4077         16.9980    1425   0.5030
0.3841         18.0000    1509   0.4855
0.3762         18.9901    1592   0.4703
0.3611         19.9920    1676   0.4587
0.3486         20.9940    1760   0.4486
0.3679         21.9960    1844   0.4416
0.3356         22.9980    1928   0.4400
0.3343         24.0000    2012   0.4387
0.3229         24.9901    2095   0.4410
0.2928         25.9920    2179   0.4377
0.3042         26.9940    2263   0.4393
0.3439         27.9960    2347   0.4353
0.3286         28.9980    2431   0.4365
0.3530         29.7018    2490   0.4360
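The step counts in the table give a rough idea of the (otherwise unpublished) dataset size: one epoch is about 83 optimizer steps at an effective batch size of 48. A back-of-the-envelope check, under the assumption that every logged step consumed a full batch:

```python
steps_per_epoch = 83          # from the table: step 83 is ~epoch 0.99
total_train_batch_size = 48   # per-device batch 8 x 6 accumulation steps

# Approximate number of training examples seen per epoch.
approx_examples = steps_per_epoch * total_train_batch_size
print(approx_examples)  # 3984
```

So the training set is on the order of 4,000 examples; the exact count is not reported in this card.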

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.20.3