TaniyaHaghighi committed on
Commit 8e66b42 · 1 Parent(s): 3b1bed9

update model card README.md

Files changed (1):
  1. README.md +57 -12
README.md CHANGED
@@ -16,12 +16,12 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 2.6359
- - Rouge1: 0.2253
- - Rouge2: 0.0794
- - Rougel: 0.1915
- - Rougelsum: 0.1915
- - Gen Len: 17.9544
+ - Loss: 1.1869
+ - Rouge1: 0.4942
+ - Rouge2: 0.346
+ - Rougel: 0.4743
+ - Rougelsum: 0.474
+ - Gen Len: 13.61
 
 ## Model description
 
@@ -46,17 +46,62 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 5
+ - num_epochs: 50
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
- | No log | 1.0 | 7 | 2.8143 | 0.2214 | 0.0798 | 0.1876 | 0.1879 | 18.1767 |
- | No log | 2.0 | 14 | 2.7335 | 0.2218 | 0.0791 | 0.1882 | 0.1885 | 18.1033 |
- | No log | 3.0 | 21 | 2.6785 | 0.2232 | 0.0791 | 0.1897 | 0.19 | 18.0456 |
- | No log | 4.0 | 28 | 2.6468 | 0.2249 | 0.0793 | 0.1914 | 0.1915 | 17.9556 |
- | No log | 5.0 | 35 | 2.6359 | 0.2253 | 0.0794 | 0.1915 | 0.1915 | 17.9544 |
+ | 2.835 | 1.0 | 57 | 2.0475 | 0.2345 | 0.0895 | 0.2074 | 0.2067 | 16.82 |
+ | 2.3806 | 2.0 | 114 | 1.7243 | 0.2854 | 0.126 | 0.2649 | 0.2667 | 15.83 |
+ | 2.1701 | 3.0 | 171 | 1.5659 | 0.3633 | 0.211 | 0.3445 | 0.3444 | 14.72 |
+ | 2.0194 | 4.0 | 228 | 1.4690 | 0.4301 | 0.2752 | 0.4115 | 0.4127 | 13.95 |
+ | 1.9218 | 5.0 | 285 | 1.4159 | 0.468 | 0.3125 | 0.4499 | 0.4493 | 13.26 |
+ | 1.8645 | 6.0 | 342 | 1.3811 | 0.487 | 0.3462 | 0.472 | 0.4715 | 13.22 |
+ | 1.8254 | 7.0 | 399 | 1.3494 | 0.4836 | 0.3424 | 0.4668 | 0.466 | 13.17 |
+ | 1.7843 | 8.0 | 456 | 1.3277 | 0.4818 | 0.3444 | 0.4676 | 0.4672 | 13.04 |
+ | 1.7543 | 9.0 | 513 | 1.3084 | 0.4775 | 0.3412 | 0.463 | 0.4627 | 13.15 |
+ | 1.7206 | 10.0 | 570 | 1.2961 | 0.476 | 0.3393 | 0.4602 | 0.4617 | 13.13 |
+ | 1.6977 | 11.0 | 627 | 1.2823 | 0.4762 | 0.3395 | 0.4603 | 0.462 | 13.18 |
+ | 1.6725 | 12.0 | 684 | 1.2701 | 0.4841 | 0.3437 | 0.4677 | 0.4694 | 13.28 |
+ | 1.6479 | 13.0 | 741 | 1.2649 | 0.4912 | 0.3505 | 0.4755 | 0.4778 | 13.37 |
+ | 1.6313 | 14.0 | 798 | 1.2546 | 0.4896 | 0.344 | 0.4724 | 0.4742 | 13.47 |
+ | 1.6154 | 15.0 | 855 | 1.2488 | 0.4898 | 0.3456 | 0.4738 | 0.476 | 13.48 |
+ | 1.5932 | 16.0 | 912 | 1.2433 | 0.4935 | 0.3506 | 0.4776 | 0.4806 | 13.47 |
+ | 1.5716 | 17.0 | 969 | 1.2347 | 0.4984 | 0.3529 | 0.4789 | 0.4815 | 13.46 |
+ | 1.5523 | 18.0 | 1026 | 1.2314 | 0.4881 | 0.3456 | 0.4713 | 0.4722 | 13.48 |
+ | 1.5393 | 19.0 | 1083 | 1.2277 | 0.4925 | 0.35 | 0.4754 | 0.4761 | 13.57 |
+ | 1.535 | 20.0 | 1140 | 1.2239 | 0.4866 | 0.3415 | 0.4693 | 0.4708 | 13.63 |
+ | 1.5389 | 21.0 | 1197 | 1.2178 | 0.4785 | 0.3359 | 0.463 | 0.4621 | 13.56 |
+ | 1.5203 | 22.0 | 1254 | 1.2132 | 0.4837 | 0.3362 | 0.4679 | 0.4682 | 13.75 |
+ | 1.4909 | 23.0 | 1311 | 1.2098 | 0.4877 | 0.3393 | 0.4716 | 0.4719 | 13.71 |
+ | 1.4957 | 24.0 | 1368 | 1.2102 | 0.4874 | 0.3393 | 0.4713 | 0.4714 | 13.66 |
+ | 1.4746 | 25.0 | 1425 | 1.2076 | 0.4881 | 0.3398 | 0.4725 | 0.4717 | 13.66 |
+ | 1.4745 | 26.0 | 1482 | 1.2041 | 0.496 | 0.3474 | 0.4799 | 0.4792 | 13.64 |
+ | 1.4605 | 27.0 | 1539 | 1.2040 | 0.4903 | 0.3416 | 0.4741 | 0.4733 | 13.65 |
+ | 1.4465 | 28.0 | 1596 | 1.2024 | 0.4961 | 0.3461 | 0.4793 | 0.4784 | 13.7 |
+ | 1.4398 | 29.0 | 1653 | 1.2006 | 0.4859 | 0.3385 | 0.4692 | 0.4698 | 13.65 |
+ | 1.4469 | 30.0 | 1710 | 1.1976 | 0.4887 | 0.3426 | 0.473 | 0.4718 | 13.69 |
+ | 1.4218 | 31.0 | 1767 | 1.1965 | 0.4934 | 0.3469 | 0.4778 | 0.4764 | 13.64 |
+ | 1.4315 | 32.0 | 1824 | 1.1966 | 0.488 | 0.3447 | 0.4726 | 0.472 | 13.51 |
+ | 1.4282 | 33.0 | 1881 | 1.1957 | 0.488 | 0.3447 | 0.4726 | 0.472 | 13.51 |
+ | 1.396 | 34.0 | 1938 | 1.1932 | 0.489 | 0.3459 | 0.4739 | 0.4729 | 13.52 |
+ | 1.4028 | 35.0 | 1995 | 1.1941 | 0.4892 | 0.3434 | 0.4723 | 0.4722 | 13.63 |
+ | 1.4068 | 36.0 | 2052 | 1.1922 | 0.4895 | 0.347 | 0.4733 | 0.4722 | 13.57 |
+ | 1.3831 | 37.0 | 2109 | 1.1911 | 0.4927 | 0.3451 | 0.4742 | 0.474 | 13.63 |
+ | 1.3781 | 38.0 | 2166 | 1.1903 | 0.4896 | 0.3434 | 0.4717 | 0.4714 | 13.57 |
+ | 1.3867 | 39.0 | 2223 | 1.1889 | 0.4915 | 0.3464 | 0.4736 | 0.4729 | 13.63 |
+ | 1.3694 | 40.0 | 2280 | 1.1893 | 0.492 | 0.3444 | 0.4728 | 0.4723 | 13.58 |
+ | 1.3912 | 41.0 | 2337 | 1.1891 | 0.4902 | 0.3448 | 0.4719 | 0.4713 | 13.46 |
+ | 1.3793 | 42.0 | 2394 | 1.1886 | 0.492 | 0.3444 | 0.4728 | 0.4723 | 13.58 |
+ | 1.3664 | 43.0 | 2451 | 1.1884 | 0.4907 | 0.3434 | 0.4717 | 0.4714 | 13.53 |
+ | 1.3787 | 44.0 | 2508 | 1.1874 | 0.4919 | 0.3442 | 0.4725 | 0.472 | 13.61 |
+ | 1.3692 | 45.0 | 2565 | 1.1871 | 0.4919 | 0.3442 | 0.4725 | 0.472 | 13.61 |
+ | 1.3732 | 46.0 | 2622 | 1.1875 | 0.492 | 0.3444 | 0.4728 | 0.4723 | 13.58 |
+ | 1.3752 | 47.0 | 2679 | 1.1872 | 0.4942 | 0.346 | 0.4743 | 0.474 | 13.61 |
+ | 1.3581 | 48.0 | 2736 | 1.1871 | 0.4942 | 0.346 | 0.4743 | 0.474 | 13.61 |
+ | 1.3509 | 49.0 | 2793 | 1.1869 | 0.4942 | 0.346 | 0.4743 | 0.474 | 13.61 |
+ | 1.3752 | 50.0 | 2850 | 1.1869 | 0.4942 | 0.346 | 0.4743 | 0.474 | 13.61 |
 
 
 ### Framework versions
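
The `lr_scheduler_type: linear` entry above means the learning rate decays linearly toward zero over the course of training. A minimal sketch of that schedule, assuming no warmup and a hypothetical base learning rate of 2e-5 (the actual `learning_rate` value is not shown in this excerpt); the 2850 total steps follow from the results table (50 epochs × 57 steps per epoch):

```python
def linear_lr(step: int, total_steps: int, base_lr: float) -> float:
    """Linearly decay the learning rate from base_lr to 0 (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

TOTAL_STEPS = 50 * 57  # 50 epochs x 57 steps/epoch, per the results table
BASE_LR = 2e-5         # hypothetical; not taken from this card

print(linear_lr(0, TOTAL_STEPS, BASE_LR))            # full base_lr at step 0
print(linear_lr(TOTAL_STEPS, TOTAL_STEPS, BASE_LR))  # 0.0 at the final step
```

The near-flat validation loss over the last ten or so epochs (1.1869 at both epoch 49 and 50) is consistent with this decay: by then the learning rate has shrunk to a small fraction of its initial value.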