silmi224 committed
Commit
242dc96
1 Parent(s): de13d00

Training complete

README.md ADDED
@@ -0,0 +1,96 @@
---
base_model: silmi224/finetune-led-35000
tags:
- summarization
- generated_from_trainer
metrics:
- rouge
model-index:
- name: exp2-led-risalah_data_v2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# exp2-led-risalah_data_v2

This model is a fine-tuned version of [silmi224/finetune-led-35000](https://huggingface.co/silmi224/finetune-led-35000) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6223
- Rouge1: 20.4859
- Rouge2: 10.2651
- Rougel: 14.7662
- Rougelsum: 19.2553

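For a quick check of the checkpoint, the `summarization` pipeline should suffice. This usage sketch is not part of the original card: the repo id is inferred from the model name above, and the input is whatever long document you want summarized.

```python
# Minimal usage sketch (assumed repo id; not from the original card).
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="silmi224/exp2-led-risalah_data_v2",  # assumption: repo id matches the card name
)

long_document = "..."  # the document to summarize
print(summarizer(long_document, truncation=True)[0]["summary_text"])
```
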
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 30
- mixed_precision_training: Native AMP

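These settings map onto Hugging Face `Seq2SeqTrainingArguments` roughly as below. This is a reconstruction, not the original training script: the output directory and eval strategy are assumptions, and dataset loading plus the ROUGE `compute_metrics` hook (sketched after the results table) are omitted.

```python
# Hypothetical reconstruction from the listed hyperparameters; the actual
# training script is not included in this repo.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="exp2-led-risalah_data_v2",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=8,  # 1 x 8 = total_train_batch_size of 8
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=30,
    seed=42,
    fp16=True,                      # "Native AMP" mixed precision
    predict_with_generate=True,     # needed to compute ROUGE during eval
    eval_strategy="epoch",          # assumption: the table reports one eval per epoch
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the optimizer default,
# so no explicit optimizer arguments are required.
```
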
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| 3.3339 | 1.0 | 10 | 2.8010 | 8.3493 | 2.4084 | 6.4284 | 7.9202 |
| 3.1015 | 2.0 | 20 | 2.5436 | 8.9461 | 2.3615 | 6.7822 | 8.3767 |
| 2.779 | 3.0 | 30 | 2.2976 | 11.5444 | 3.5251 | 8.0258 | 10.4456 |
| 2.5118 | 4.0 | 40 | 2.1282 | 13.3666 | 4.1766 | 9.2522 | 11.9858 |
| 2.3057 | 5.0 | 50 | 2.0147 | 15.021 | 5.5582 | 10.3573 | 14.1171 |
| 2.1541 | 6.0 | 60 | 1.9283 | 15.937 | 6.8169 | 11.0627 | 14.6866 |
| 2.0326 | 7.0 | 70 | 1.8601 | 14.7364 | 5.5533 | 10.3599 | 13.9586 |
| 1.938 | 8.0 | 80 | 1.8050 | 14.8895 | 6.0535 | 9.9969 | 14.4782 |
| 1.8462 | 9.0 | 90 | 1.7492 | 14.0282 | 5.8353 | 9.232 | 13.2213 |
| 1.7767 | 10.0 | 100 | 1.7214 | 16.7779 | 7.2314 | 11.1359 | 16.1369 |
| 1.7042 | 11.0 | 110 | 1.6857 | 18.4084 | 8.7509 | 12.7906 | 17.8835 |
| 1.6543 | 12.0 | 120 | 1.6610 | 19.2909 | 8.9371 | 13.1256 | 17.6865 |
| 1.5958 | 13.0 | 130 | 1.6335 | 19.8664 | 9.7174 | 13.6907 | 18.8411 |
| 1.5414 | 14.0 | 140 | 1.6145 | 19.2112 | 9.6741 | 14.1273 | 17.7185 |
| 1.496 | 15.0 | 150 | 1.6234 | 18.8087 | 9.0827 | 13.6381 | 17.6146 |
| 1.4534 | 16.0 | 160 | 1.6035 | 19.4539 | 10.135 | 14.4283 | 18.5099 |
| 1.4177 | 17.0 | 170 | 1.5948 | 19.6367 | 10.405 | 14.0816 | 18.0333 |
| 1.3742 | 18.0 | 180 | 1.5712 | 18.8434 | 10.1431 | 13.7222 | 17.6519 |
| 1.3378 | 19.0 | 190 | 1.5829 | 18.9662 | 10.7079 | 13.9422 | 18.1457 |
| 1.3068 | 20.0 | 200 | 1.5746 | 20.724 | 11.3974 | 15.1529 | 19.8343 |
| 1.2669 | 21.0 | 210 | 1.5476 | 19.0993 | 9.6869 | 13.815 | 18.5096 |
| 1.2315 | 22.0 | 220 | 1.5606 | 20.4637 | 10.7418 | 14.634 | 19.5588 |
| 1.2005 | 23.0 | 230 | 1.5617 | 19.3271 | 9.8272 | 14.2547 | 18.5378 |
| 1.1649 | 24.0 | 240 | 1.5618 | 20.3699 | 11.3093 | 14.2115 | 19.4149 |
| 1.1344 | 25.0 | 250 | 1.5649 | 20.8124 | 11.3997 | 15.8717 | 20.0457 |
| 1.099 | 26.0 | 260 | 1.5985 | 19.8977 | 9.9926 | 14.1038 | 19.0059 |
| 1.065 | 27.0 | 270 | 1.5678 | 20.7049 | 10.9546 | 14.4462 | 19.5927 |
| 1.0344 | 28.0 | 280 | 1.6225 | 21.3939 | 11.2821 | 15.0261 | 20.3781 |
| 1.0029 | 29.0 | 290 | 1.5831 | 20.7287 | 11.0327 | 14.3893 | 19.9485 |
| 0.9711 | 30.0 | 300 | 1.6223 | 20.4859 | 10.2651 | 14.7662 | 19.2553 |

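The ROUGE columns come from an evaluation hook that the card itself does not include. A typical `compute_metrics` for this setup, offered as an assumed sketch in the style of the Transformers summarization examples rather than the original code, would be:

```python
# Assumed ROUGE evaluation hook; not taken from this repo.
import evaluate
import numpy as np
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("silmi224/finetune-led-35000")
rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    # Replace label padding (-100) before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    scores = rouge.compute(
        predictions=decoded_preds, references=decoded_labels, use_stemmer=True
    )
    # Scale to percentages, matching the table above.
    return {k: round(v * 100, 4) for k, v in scores.items()}
```
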
### Framework versions

- Transformers 4.41.2
- Pytorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1
generation_config.json ADDED
@@ -0,0 +1,14 @@
{
  "bos_token_id": 0,
  "decoder_start_token_id": 2,
  "early_stopping": true,
  "eos_token_id": 2,
  "length_penalty": 2.0,
  "max_length": 128,
  "min_length": 40,
  "no_repeat_ngram_size": 3,
  "num_beams": 2,
  "pad_token_id": 1,
  "transformers_version": "4.41.2",
  "use_cache": false
}
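These defaults are picked up automatically when the checkpoint is loaded, so passing them to `generate` is redundant; the sketch below (same assumed repo id as above) only spells them out for clarity.

```python
# Sketch: generate with the generation_config.json settings made explicit.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo_id = "silmi224/exp2-led-risalah_data_v2"  # assumption
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

inputs = tokenizer("...long document...", return_tensors="pt")
summary_ids = model.generate(
    **inputs,
    num_beams=2,
    length_penalty=2.0,
    max_length=128,
    min_length=40,
    no_repeat_ngram_size=3,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```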
runs/Jul21_10-42-00_a2dbd946cdd2/events.out.tfevents.1721558953.a2dbd946cdd2.34.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:db86bbaf836b30fdd965cb1c11d9affeaaa31f9fb969e7296b15ad58309c29e9
-size 25584
+oid sha256:76bae21ebf99b035d8692c96d6ff5aaeb276afd7a0ac1efc2391736c38547d54
+size 26412
runs/Jul21_10-42-00_a2dbd946cdd2/events.out.tfevents.1721573667.a2dbd946cdd2.34.1 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:76380a3ca7e4011616f37c8d5e08b20161579457c32656489da451e453a8f783
size 562