model update
README.md
CHANGED
@@ -14,11 +14,11 @@ pipeline_tag: text2text-generation
 tags:
 - question generation
 widget:
-- text: "
+- text: "<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records."
   example_title: "Question Generation Example 1"
-- text: "
+- text: "Beyonce further expanded her acting career, starring as blues singer <hl> Etta James <hl> in the 2008 musical biopic, Cadillac Records."
   example_title: "Question Generation Example 2"
-- text: "
+- text: "Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, <hl> Cadillac Records <hl> ."
   example_title: "Question Generation Example 3"
 model-index:
 - name: lmqg/mt5-small-squad
@@ -231,7 +231,7 @@ model_path = 'lmqg/mt5-small-squad'
 pipe = pipeline("text2text-generation", model_path)
 
 # Question Generation
-question = pipe('
+question = pipe('<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
 ```
 
 ## Evaluation Metrics
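This hunk drops the `generate question: ` task prefix from the pipeline input (the old form is still visible in the hunk headers below), leaving only the `<hl>` highlight tokens around the answer span. A minimal self-contained sketch of the updated call, assuming only that the transformers library is installed; the `print` is added here for illustration:

```python
from transformers import pipeline

# Load the question-generation model this card describes.
model_path = 'lmqg/mt5-small-squad'
pipe = pipeline("text2text-generation", model_path)

# Updated input format: no "generate question: " prefix; the answer
# span is marked with <hl> tokens directly.
question = pipe('<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
print(question)  # list with a 'generated_text' entry per input
```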
@@ -241,7 +241,7 @@ question = pipe('generate question: <hl> Beyonce <hl> further expanded her actin
 
 | Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
 |:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
-| [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) | default | 0.
+| [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) | default | 0.217 | 0.489 | 0.238 | 0.9 | 0.627 | [link](https://huggingface.co/lmqg/mt5-small-squad/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_squad.default.json) |
 
 
@@ -249,13 +249,13 @@ question = pipe('generate question: <hl> Beyonce <hl> further expanded her actin
 
 | Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
 |:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
-| [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) | default | 0.
-| [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) | default |
-| [lmqg/qg_ruquad](https://huggingface.co/datasets/lmqg/qg_ruquad) | default |
-| [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) | default |
-| [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) | default | 0.
-| [lmqg/qg_frquad](https://huggingface.co/datasets/lmqg/qg_frquad) | default | 0.
-| [lmqg/qg_koquad](https://huggingface.co/datasets/lmqg/qg_koquad) | default |
+| [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) | default | 0.005 | 0.05 | 0.059 | 0.726 | 0.502 | [link](https://huggingface.co/lmqg/mt5-small-squad/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_itquad.default.json) |
+| [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) | default | 0.0 | 0.061 | 0.005 | 0.661 | 0.465 | [link](https://huggingface.co/lmqg/mt5-small-squad/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_jaquad.default.json) |
+| [lmqg/qg_ruquad](https://huggingface.co/datasets/lmqg/qg_ruquad) | default | 0.0 | 0.01 | 0.018 | 0.709 | 0.491 | [link](https://huggingface.co/lmqg/mt5-small-squad/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_ruquad.default.json) |
+| [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) | default | 0.0 | 0.016 | 0.048 | 0.735 | 0.504 | [link](https://huggingface.co/lmqg/mt5-small-squad/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_dequad.default.json) |
+| [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) | default | 0.006 | 0.052 | 0.06 | 0.749 | 0.506 | [link](https://huggingface.co/lmqg/mt5-small-squad/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_esquad.default.json) |
+| [lmqg/qg_frquad](https://huggingface.co/datasets/lmqg/qg_frquad) | default | 0.017 | 0.158 | 0.082 | 0.729 | 0.51 | [link](https://huggingface.co/lmqg/mt5-small-squad/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_frquad.default.json) |
+| [lmqg/qg_koquad](https://huggingface.co/datasets/lmqg/qg_koquad) | default | 0.0 | 0.001 | 0.007 | 0.663 | 0.459 | [link](https://huggingface.co/lmqg/mt5-small-squad/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_koquad.default.json) |
 
 
 ## Training hyperparameters
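The Link column in the tables above points to raw JSON files carrying the full metric breakdown behind each row. A hypothetical sketch of pulling one of them with the Python standard library; the URL is copied from the in-domain lmqg/qg_squad row, and the key layout of the JSON is not specified in this diff:

```python
import json
import urllib.request

# Raw evaluation file linked from the lmqg/qg_squad row above.
url = ("https://huggingface.co/lmqg/mt5-small-squad/raw/main/"
       "eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_squad.default.json")

with urllib.request.urlopen(url) as resp:
    metrics = json.load(resp)

# Inspect the full breakdown summarized in the table.
print(metrics)
```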