asahi417 committed
Commit dafca19 · 1 Parent(s): 8df1e89

model update

Files changed (1)
  1. README.md +10 -10
README.md CHANGED
@@ -14,11 +14,11 @@ pipeline_tag: text2text-generation
  tags:
  - question generation
  widget:
- - text: "generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records."
+ - text: "generate question: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl>"
    example_title: "Question Generation Example 1"
- - text: "generate question: Beyonce further expanded her acting career, starring as blues singer <hl> Etta James <hl> in the 2008 musical biopic, Cadillac Records."
+ - text: "generate question: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl>"
    example_title: "Question Generation Example 2"
- - text: "generate question: Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, <hl> Cadillac Records <hl> ."
+ - text: "generate question: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records . <hl>"
    example_title: "Question Generation Example 3"
  model-index:
  - name: lmqg/t5-small-squad-no-answer
@@ -33,19 +33,19 @@ model-index:
  metrics:
  - name: BLEU4
    type: bleu4
- value: 0.21485258550976716
+ value: 0.21121852916203582
  - name: ROUGE-L
    type: rouge-l
- value: 0.4814016162942198
+ value: 0.4746967055057577
  - name: METEOR
    type: meteor
- value: 0.23377448216932625
+ value: 0.23384596803152297
  - name: BERTScore
    type: bertscore
- value: 0.8989167797389837
+ value: 0.8964476409584947
  - name: MoverScore
    type: moverscore
- value: 0.6235067427760864
+ value: 0.6207232474685432
  ---

  # Language Models Fine-tuning on Question Generation: `lmqg/t5-small-squad-no-answer`
@@ -70,7 +70,7 @@ model_path = 'lmqg/t5-small-squad-no-answer'
  pipe = pipeline("text2text-generation", model_path)

  # Question Generation
- input_text = 'generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.'
+ input_text = 'generate question: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl>'
  question = pipe(input_text)
  ```
@@ -81,7 +81,7 @@ question = pipe(input_text)

  | Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
  |:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
- | [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) | default | 0.21485258550976716 | 0.4814016162942198 | 0.23377448216932625 | 0.8989167797389837 | 0.6235067427760864 | [link](https://huggingface.co/lmqg/t5-small-squad-no-answer/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_squad.default.json) |
+ | [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) | default | 0.21121852916203582 | 0.4746967055057577 | 0.23384596803152297 | 0.8964476409584947 | 0.6207232474685432 | [link](https://huggingface.co/lmqg/t5-small-squad-no-answer/raw/main/eval/metric.first.sentence.paragraph_sentence.question.lmqg_qg_squad.default.json) |
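For reference, the updated usage snippet touched by this commit can be run end to end as sketched below. This is a minimal sketch assuming the `transformers` library is installed; the `from transformers import pipeline` import and the final `print` call are added here for completeness and are not part of the diff itself.

```python
# Minimal sketch of the updated README usage (assumes `transformers` is installed).
from transformers import pipeline

model_path = 'lmqg/t5-small-squad-no-answer'
pipe = pipeline("text2text-generation", model_path)

# Question Generation: the updated prompt wraps the whole highlighted sentence
# in <hl> tokens, since this "no-answer" variant is not given an answer span.
input_text = 'generate question: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl>'
question = pipe(input_text)
print(question)  # e.g. a list with one dict holding the generated question text
```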