pszemraj committed
Commit e726bad · Parent: a0560e1

Update README.md

Files changed (1)
  1. README.md +8 -16
README.md CHANGED
@@ -3,17 +3,17 @@ tags:
  - generated_from_trainer
  metrics:
  - rouge
- model-index:
- - name: summ-long-t5-tglobal-xl-qmsum-v1--16384-hiLR
-   results: []
+ license: bsd
+ datasets:
+ - pszemraj/qmsum-cleaned
+ language:
+ - en
+ pipeline_tag: summarization
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->
-
- # summ-long-t5-tglobal-xl-qmsum-v1--16384-hiLR
+ # long-t5-tglobal-xl-qmsum-wip

- This model is a fine-tuned version of [pszemraj/long-t5-tglobal-xl-qmsum-v1](https://huggingface.co/pszemraj/long-t5-tglobal-xl-qmsum-v1) on the None dataset.
+ This model is a fine-tuned version of [google/long-t5-tglobal-xl](https://huggingface.co/google/long-t5-tglobal-xl) on the `pszemraj/qmsum-cleaned` dataset.
  It achieves the following results on the evaluation set:
  - Loss: 2.0505
  - Rouge1: 35.3881
@@ -57,11 +57,3 @@ The following hyperparameters were used during training:
  | 1.5376 | 1.0 | 99 | 2.0104 | 35.8802 | 11.4595 | 23.6656 | 31.49 | 77.77 |
  | 1.499 | 2.0 | 198 | 2.0358 | 35.1265 | 11.549 | 23.1062 | 30.8815 | 88.88 |
  | 1.5034 | 3.0 | 297 | 2.0505 | 35.3881 | 11.509 | 23.1543 | 31.3295 | 80.8 |
-
-
- ### Framework versions
-
- - Transformers 4.29.2
- - Pytorch 2.0.1+cu118
- - Datasets 2.12.0
- - Tokenizers 0.13.3
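For readers skimming this commit, a minimal usage sketch of the checkpoint the updated card describes. The hub repo id below is an assumption inferred from the new card title (`long-t5-tglobal-xl-qmsum-wip` under the `pszemraj` namespace); the diff does not state the actual repo path, and the generation settings are illustrative, not the author's.

```python
from transformers import pipeline

# NOTE: repo id is assumed from the card title in this commit's diff,
# not confirmed by the diff itself; substitute the real hub path if it differs.
summarizer = pipeline(
    "summarization",
    model="pszemraj/long-t5-tglobal-xl-qmsum-wip",
)

# QMSum is query-focused meeting summarization, so inputs are long transcripts.
transcript = "..."  # replace with the meeting transcript to summarize
result = summarizer(
    transcript,
    max_length=256,          # illustrative generation settings
    no_repeat_ngram_size=3,
)
print(result[0]["summary_text"])
```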