exp2-led-risalah_data_v3

This model is a fine-tuned version of silmi224/finetune-led-35000 on an unspecified dataset. It achieves the following results on the evaluation set (a hedged usage sketch follows the list):

  • Loss: 1.9287
  • ROUGE-1: 16.3563
  • ROUGE-2: 6.3361
  • ROUGE-L: 10.2361
  • ROUGE-Lsum: 15.4499
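
A minimal usage sketch follows, assuming this checkpoint is a LED-based summarization model that can be loaded with AutoModelForSeq2SeqLM; the input/output lengths and generation settings below are illustrative choices, not values documented in this card.

```python
# Minimal inference sketch (assumptions: LED-based summarizer; lengths and
# beam settings are illustrative, not taken from the original setup).
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "silmi224/exp2-led-risalah_data_v3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

document = "..."  # long input document to summarize

inputs = tokenizer(document, max_length=4096, truncation=True, return_tensors="pt")

# LED uses sparse attention; the usual convention is to give the first token
# global attention so it can attend to the whole sequence.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

summary_ids = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
    max_length=256,
    num_beams=4,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```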

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 1e-06
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 150
  • num_epochs: 20
  • mixed_precision_training: Native AMP
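
These settings map closely onto Hugging Face Seq2SeqTrainingArguments, and the sketch below reconstructs them under that assumption. The output_dir, evaluation_strategy, and predict_with_generate values are not documented in this card and are shown only as plausible settings.

```python
# Hedged reconstruction of the training configuration listed above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="exp2-led-risalah_data_v3",  # illustrative path
    learning_rate=1e-6,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=8,   # effective train batch size of 8
    lr_scheduler_type="linear",
    warmup_steps=150,
    num_train_epochs=20,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    predict_with_generate=True,      # assumption: needed to compute ROUGE during eval
    evaluation_strategy="epoch",     # assumption: eval ran once per epoch (every 10 steps)
)
```

The Adam betas and epsilon listed above are the Transformers defaults, so they need no explicit arguments.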

Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|---------------|-------|------|-----------------|---------|---------|---------|------------|
| 3.3696 | 1.0  | 10  | 2.9032 | 8.9827  | 2.4864 | 6.5741  | 8.374   |
| 3.3479 | 2.0  | 20  | 2.8646 | 9.4368  | 2.6548 | 6.6897  | 8.9073  |
| 3.2858 | 3.0  | 30  | 2.8050 | 8.3204  | 2.4233 | 6.4571  | 7.8334  |
| 3.204  | 4.0  | 40  | 2.7299 | 7.9763  | 2.7995 | 6.1867  | 7.5793  |
| 3.0987 | 5.0  | 50  | 2.6458 | 9.4672  | 2.877  | 7.2221  | 8.8929  |
| 2.9964 | 6.0  | 60  | 2.5576 | 9.3123  | 2.635  | 6.8591  | 8.8136  |
| 2.8831 | 7.0  | 70  | 2.4682 | 9.8347  | 2.8621 | 7.3463  | 9.346   |
| 2.7834 | 8.0  | 80  | 2.3818 | 9.756   | 2.6064 | 7.3736  | 9.0638  |
| 2.6712 | 9.0  | 90  | 2.3005 | 10.6798 | 3.5515 | 7.9318  | 9.5388  |
| 2.5781 | 10.0 | 100 | 2.2261 | 11.4114 | 3.5141 | 8.0732  | 10.6929 |
| 2.4807 | 11.0 | 110 | 2.1623 | 12.9396 | 4.3079 | 9.1668  | 11.7355 |
| 2.403  | 12.0 | 120 | 2.1101 | 13.27   | 4.7477 | 9.0288  | 12.277  |
| 2.3358 | 13.0 | 130 | 2.0644 | 15.1784 | 5.3452 | 10.1318 | 13.8506 |
| 2.2701 | 14.0 | 140 | 2.0249 | 14.1959 | 5.2981 | 10.2128 | 12.8727 |
| 2.2032 | 15.0 | 150 | 1.9925 | 14.4716 | 5.5627 | 9.58    | 13.7089 |
| 2.1608 | 16.0 | 160 | 1.9685 | 14.2815 | 5.9009 | 9.516   | 13.4755 |
| 2.1338 | 17.0 | 170 | 1.9509 | 15.6523 | 6.3449 | 10.2105 | 14.9489 |
| 2.104  | 18.0 | 180 | 1.9383 | 16.3987 | 7.0987 | 10.8261 | 15.8296 |
| 2.0896 | 19.0 | 190 | 1.9308 | 16.0883 | 6.3808 | 10.0722 | 15.17   |
| 2.0758 | 20.0 | 200 | 1.9287 | 16.3563 | 6.3361 | 10.2361 | 15.4499 |
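
The ROUGE columns appear to be reported on a 0-100 scale. As a point of reference, the sketch below shows how such scores are commonly computed with the evaluate library; the predictions and references are placeholders, not outputs of this model.

```python
# Common way to compute ROUGE with the `evaluate` library (placeholder data).
import evaluate

rouge = evaluate.load("rouge")
predictions = ["generated summary ..."]  # model outputs (placeholders)
references = ["reference summary ..."]   # gold summaries (placeholders)

scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)
# Scale to 0-100 to match the table above (e.g. ROUGE-1 = 16.3563 at the final epoch).
print({name: round(value * 100, 4) for name, value in scores.items()})
```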

Framework versions

  • Transformers 4.41.2
  • PyTorch 2.1.2
  • Datasets 2.19.2
  • Tokenizers 0.19.1