pszemraj committed 33c9eba (1 parent: 2496e86)

Update README.md

Files changed (1): README.md (+3 -5)
README.md CHANGED

@@ -6,12 +6,10 @@ model-index:
 results: []
 ---
 
-<!-- This model card has been generated automatically according to the information the Trainer had access to. You
-should probably proofread and complete it, then remove this comment. -->
 
-# long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP12-ft3-booksum
+# long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP13
 
-This model is a fine-tuned version of [pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP12](https://huggingface.co/pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP12) on the None dataset.
+This model is a fine-tuned version of [pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP12](https://huggingface.co/pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP12) on the `kmfoda/booksum` dataset. Evaluating some metric results before merging with the "main" WIP version.
 
 ## Model description
 
@@ -40,7 +38,7 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_ratio: 0.01
-- num_epochs: 3
+- num_epochs: 1.1
 
 ### Framework versions
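The second hunk keeps the scheduler settings (`lr_scheduler_type: cosine`, `lr_scheduler_warmup_ratio: 0.01`) and only changes the epoch count. As a rough sketch of what that schedule does, here is a plain-Python cosine-with-warmup curve; the base learning rate and total step count below are illustrative placeholders, since the diff does not state them, and this is not the Trainer's actual implementation.

```python
import math

def cosine_lr_with_warmup(step, total_steps, base_lr=1e-4, warmup_ratio=0.01):
    """Cosine decay with linear warmup, mirroring the README's scheduler settings.

    The commit only fixes the scheduler type (cosine) and warmup_ratio (0.01);
    base_lr and total_steps here are illustrative assumptions.
    """
    warmup_steps = max(1, int(total_steps * warmup_ratio))
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr over the first 1% of steps.
        return base_lr * step / warmup_steps
    # Cosine decay from base_lr down to 0 over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

# Illustrative run: LR rises for the first 10 steps, then decays to ~0.
total = 1000
lrs = [cosine_lr_with_warmup(s, total) for s in range(total + 1)]
```

The peak of the curve lands exactly at the end of warmup (step `total_steps * warmup_ratio`), which is why a small warmup ratio like 0.01 spends almost the entire run on the cosine-decay portion.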