Alikhan Urumov committed on
Commit 8bf140a
1 Parent(s): f6c340c

update model card README.md

Files changed (1)
README.md +20 -1
README.md CHANGED
@@ -1,6 +1,8 @@
  ---
  tags:
  - generated_from_trainer
+ metrics:
+ - rouge
  model-index:
  - name: t5-russian-spell
    results: []
@@ -12,6 +14,13 @@ should probably proofread and complete it, then remove this comment. -->
  # t5-russian-spell
  
  This model is a fine-tuned version of [sberbank-ai/ruT5-base](https://huggingface.co/sberbank-ai/ruT5-base) on the None dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.4066
+ - Rouge1: 44.2214
+ - Rouge2: 21.688
+ - Rougel: 44.2793
+ - Rougelsum: 44.0781
+ - Gen Len: 60.87
  
  ## Model description
  
@@ -30,7 +39,7 @@ More information needed
  ### Training hyperparameters
  
  The following hyperparameters were used during training:
- - learning_rate: 0.0002
+ - learning_rate: 2e-05
  - train_batch_size: 8
  - eval_batch_size: 8
  - seed: 42
@@ -39,6 +48,16 @@ The following hyperparameters were used during training:
  - num_epochs: 1
  - mixed_precision_training: Native AMP
  
+ ### Training results
+
+ | Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
+ |:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
+ | 0.2958        | 0.2   | 2500  | 0.4393          | 43.9635 | 21.3982 | 43.9784 | 43.8423   | 61.338  |
+ | 0.2427        | 0.4   | 5000  | 0.4460          | 44.609  | 22.1448 | 44.6314 | 44.4817   | 61.028  |
+ | 0.5326        | 0.6   | 7500  | 0.4100          | 44.7071 | 21.9365 | 44.7491 | 44.5944   | 60.844  |
+ | 0.5262        | 0.8   | 10000 | 0.4066          | 44.2214 | 21.688  | 44.2793 | 44.0781   | 60.87   |
+
+
  ### Framework versions
  
  - Transformers 4.17.0
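
The card still reads "More information needed" for usage, so the following is only a minimal inference sketch for a ruT5-based spelling model of this kind. The hub repository id is not shown in this diff, so `"t5-russian-spell"` below is a placeholder for the published checkpoint path, and the plain-text-in, corrected-text-out format is an assumption.

```python
# Minimal inference sketch; the repo id and I/O format are assumptions,
# not taken from this commit.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "t5-russian-spell"  # placeholder: substitute the published checkpoint path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "очен хочеца погулять севодня вечером"  # noisy Russian input (toy example)
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```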
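For context on the hyperparameters listed in this revision, the sketch below shows one plausible way they map onto a `Seq2SeqTrainingArguments` configuration in transformers. The training script itself is not part of this commit, so the output directory, evaluation strategy, and eval interval are assumptions (the 2500-step interval is inferred from the results table).

```python
# Hedged sketch only: the commit does not include the training script.
# Values marked "from the card" come from the README diff above; everything
# else (output_dir, evaluation_strategy, eval_steps) is an assumption.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-russian-spell",       # assumed
    learning_rate=2e-5,                  # from the card (updated in this commit)
    per_device_train_batch_size=8,       # from the card
    per_device_eval_batch_size=8,        # from the card
    seed=42,                             # from the card
    num_train_epochs=1,                  # from the card
    fp16=True,                           # "mixed_precision_training: Native AMP"
    evaluation_strategy="steps",         # assumed from the 2500-step eval cadence
    eval_steps=2500,
    predict_with_generate=True,          # needed for ROUGE and generation length
)
```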
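The ROUGE columns in the results table are the standard rouge1/rouge2/rougeL/rougeLsum scores. As an illustration only (the commit does not show the metric code), scores of this kind can be computed with the Hugging Face `evaluate` library; note that the default `rouge_score` tokenizer targets Latin text, so for Russian output newer versions of the metric are usually given a custom tokenizer such as whitespace splitting.

```python
# Illustrative only: this is not the author's evaluation code.
import evaluate

rouge = evaluate.load("rouge")  # requires the rouge_score package
scores = rouge.compute(
    predictions=["сегодня хорошая погода"],   # toy model output
    references=["Сегодня хорошая погода."],   # toy reference
    tokenizer=lambda text: text.split(),      # whitespace split for Cyrillic text
)
print(scores)  # dict with rouge1, rouge2, rougeL, rougeLsum
```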