asahi417 committed
Commit 54b6794 · 1 Parent(s): 0da3151

model update

Files changed (1): README.md (+91 −51)
README.md CHANGED
@@ -21,9 +21,9 @@ widget:
  example_title: "Question Generation Example 2"
  - text: "generate question: Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, <hl> Cadillac Records <hl> ."
  example_title: "Question Generation Example 3"
- - text: "<hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress."
  example_title: "Answer Extraction Example 1"
- - text: "Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress. <hl>"
  example_title: "Answer Extraction Example 2"
  model-index:
  - name: lmqg/mt5-base-dequad-multitask
@@ -36,45 +36,50 @@ model-index:
  type: default
  args: default
  metrics:
- - name: BLEU4
-   type: bleu4
-   value: 0.0037638715919786907
- - name: ROUGE-L
-   type: rouge-l
-   value: 0.08578655213486944
- - name: METEOR
-   type: meteor
-   value: 0.1055901831758648
- - name: BERTScore
-   type: bertscore
-   value: 0.7786051149573353
- - name: MoverScore
-   type: moverscore
-   value: 0.537714157008381
  ---

  # Model Card of `lmqg/mt5-base-dequad-multitask`
- This model is fine-tuned version of [google/mt5-base](https://huggingface.co/google/mt5-base) for question generation task on the
- [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
- This model is fine-tuned on the answer extraction task as well as the question generation.

- Please cite our paper if you use the model ([https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)).
-
- ```
-
- @inproceedings{ushio-etal-2022-generative,
-     title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
-     author = "Ushio, Asahi and
-         Alva-Manchego, Fernando and
-         Camacho-Collados, Jose",
-     booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
-     month = dec,
-     year = "2022",
-     address = "Abu Dhabi, U.A.E.",
-     publisher = "Association for Computational Linguistics",
- }
-
- ```

  ### Overview
  - **Language model:** [google/mt5-base](https://huggingface.co/google/mt5-base)
@@ -87,38 +92,74 @@ Please cite our paper if you use the model ([https://arxiv.org/abs/2210.03992](h
  ### Usage
  - With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
  ```python
-
  from lmqg import TransformersQG

  # initialize model
- model = TransformersQG(language='en', model='lmqg/mt5-base-dequad-multitask')

  # model prediction
- question_answer = model.generate_qa("William Turner was an English painter who specialised in watercolour landscapes")

  ```

  - With `transformers`
  ```python
-
  from transformers import pipeline
- # initialize model
- pipe = pipeline("text2text-generation", 'lmqg/mt5-base-dequad-multitask')

  # answer extraction
- answer = pipe('extract answers: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress.')

  # question generation
- question = pipe('generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')

  ```

- ## Evaluation Metrics

- ### Metrics

- | Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
- |:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
- | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) | default | 0.004 | 0.086 | 0.106 | 0.779 | 0.538 | [link](https://huggingface.co/lmqg/mt5-base-dequad-multitask/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_dequad.default.json) |
  ## Training hyperparameters
@@ -144,7 +185,6 @@ The full configuration can be found at [fine-tuning config file](https://hugging

  ## Citation
  ```
-
  @inproceedings{ushio-etal-2022-generative,
      title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
      author = "Ushio, Asahi and
 
  example_title: "Question Generation Example 2"
  - text: "generate question: Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, <hl> Cadillac Records <hl> ."
  example_title: "Question Generation Example 3"
+ - text: "extract answers: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress."
  example_title: "Answer Extraction Example 1"
+ - text: "extract answers: Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress. <hl>"
  example_title: "Answer Extraction Example 2"
  model-index:
  - name: lmqg/mt5-base-dequad-multitask

  type: default
  args: default
  metrics:
+ - name: BLEU4 (Question Generation)
+   type: bleu4_question_generation
+   value: 0.38
+ - name: ROUGE-L (Question Generation)
+   type: rouge_l_question_generation
+   value: 8.58
+ - name: METEOR (Question Generation)
+   type: meteor_question_generation
+   value: 10.56
+ - name: BERTScore (Question Generation)
+   type: bertscore_question_generation
+   value: 77.86
+ - name: MoverScore (Question Generation)
+   type: moverscore_question_generation
+   value: 53.77
+ - name: QAAlignedF1Score-BERTScore
+   type: qa_aligned_f1_score_bertscore
+   value: 6.11
+ - name: QAAlignedRecall-BERTScore
+   type: qa_aligned_recall_bertscore
+   value: 5.95
+ - name: QAAlignedPrecision-BERTScore
+   type: qa_aligned_precision_bertscore
+   value: 6.3
+ - name: QAAlignedF1Score-MoverScore
+   type: qa_aligned_f1_score_moverscore
+   value: 4.24
+ - name: QAAlignedRecall-MoverScore
+   type: qa_aligned_recall_moverscore
+   value: 4.15
+ - name: QAAlignedPrecision-MoverScore
+   type: qa_aligned_precision_moverscore
+   value: 4.34
+ - name: AnswerF1Score (Answer Extraction)
+   type: answer_f1_score_answer_extraction
+   value: 8.63
+ - name: AnswerExactMatch (Answer Extraction)
+   type: answer_exact_match_answer_extraction
+   value: 0.32
  ---

  # Model Card of `lmqg/mt5-base-dequad-multitask`
+ This model is a fine-tuned version of [google/mt5-base](https://huggingface.co/google/mt5-base), trained jointly on the question generation and answer extraction tasks on [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
  ### Overview
  - **Language model:** [google/mt5-base](https://huggingface.co/google/mt5-base)

  ### Usage
  - With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
  ```python
  from lmqg import TransformersQG
+
  # initialize model
+ model = TransformersQG(language="en", model="lmqg/mt5-base-dequad-multitask")
+
  # model prediction
+ question_answer_pairs = model.generate_qa("William Turner was an English painter who specialised in watercolour landscapes")

  ```

  - With `transformers`
  ```python
  from transformers import pipeline
+
+ pipe = pipeline("text2text-generation", "lmqg/mt5-base-dequad-multitask")
+
  # answer extraction
+ answer = pipe("extract answers: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress.")
+
  # question generation
+ question = pipe("generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.")

  ```
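Both prompt formats above follow the same highlight convention: the target span (an answer for question generation, a sentence for answer extraction) is wrapped in `<hl>` tokens and a task prefix is prepended. A minimal sketch of building such prompts with plain string handling — `build_prompt` is an illustrative helper, not part of `lmqg` or `transformers`:

```python
def build_prompt(task_prefix: str, context: str, span: str) -> str:
    """Wrap the first occurrence of `span` in <hl> markers and prepend a task prefix."""
    if span not in context:
        raise ValueError("highlight span must occur in the context")
    highlighted = context.replace(span, f"<hl> {span} <hl>", 1)
    return f"{task_prefix}: {highlighted}"


# Highlight the answer span "Beyonce" for question generation.
prompt = build_prompt(
    "generate question",
    "Beyonce further expanded her acting career.",
    "Beyonce",
)
print(prompt)  # generate question: <hl> Beyonce <hl> further expanded her acting career.
```

The resulting string can be passed directly to either pipeline above.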

+ ## Evaluation
+
+ - ***Metric (Question Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-dequad-multitask/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_dequad.default.json)
+
+ |            | Score | Type    | Dataset                                                          |
+ |:-----------|------:|:--------|:-----------------------------------------------------------------|
+ | BERTScore  | 77.86 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_1     |  8.19 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_2     |  3.17 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_3     |  1.12 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_4     |  0.38 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | METEOR     | 10.56 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | MoverScore | 53.77 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | ROUGE_L    |  8.58 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+
+ - ***Metric (Question & Answer Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-dequad-multitask/raw/main/eval/metric.first.answer.paragraph.questions_answers.lmqg_qg_dequad.default.json)
+
+ |                                 | Score | Type    | Dataset                                                          |
+ |:--------------------------------|------:|:--------|:-----------------------------------------------------------------|
+ | QAAlignedF1Score (BERTScore)    |  6.11 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedF1Score (MoverScore)   |  4.24 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedPrecision (BERTScore)  |  6.3  | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedPrecision (MoverScore) |  4.34 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedRecall (BERTScore)     |  5.95 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedRecall (MoverScore)    |  4.15 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+
+ - ***Metric (Answer Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-dequad-multitask/raw/main/eval/metric.first.answer.paragraph_sentence.answer.lmqg_qg_dequad.default.json)
+
+ |                  | Score | Type    | Dataset                                                          |
+ |:-----------------|------:|:--------|:-----------------------------------------------------------------|
+ | AnswerExactMatch |  0.32 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | AnswerF1Score    |  8.63 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | BERTScore        | 63.01 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_1           |  5.06 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_2           |  2.4  | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_3           |  1.35 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_4           |  0.85 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | METEOR           |  5.34 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | MoverScore       | 47.54 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | ROUGE_L          |  3.93 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+

  ## Training hyperparameters

  ## Citation
  ```
  @inproceedings{ushio-etal-2022-generative,
      title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
      author = "Ushio, Asahi and