abhik1505040 committed on
Commit
5849c79
1 Parent(s): 93a16ba

Updated README.md

Files changed (1)
  1. README.md +10 -18
README.md CHANGED
@@ -55,7 +55,7 @@ widget:
 
 # mT5-m2o-english-CrossSum
 
-This repository contains the mT5 checkpoint finetuned on all cross-lingual pairs of the [CrossSum](https://huggingface.co/datasets/csebuetnlp/xlsum) dataset, where the target summary was in **english**, i.e. this model tries to summarize text written in any language in English. For finetuning details and scripts, see the [paper]() and the [official repository](https://github.com/csebuetnlp/CrossSum).
+This repository contains the many-to-one (m2o) mT5 checkpoint finetuned on all cross-lingual pairs of the [CrossSum](https://huggingface.co/datasets/csebuetnlp/CrossSum) dataset, where the target summary was in **english**, i.e. this model tries to **summarize text written in any language in English.** For finetuning details and scripts, see the [paper](https://arxiv.org/abs/2112.08804) and the [official repository](https://github.com/csebuetnlp/CrossSum).
 
 
 ## Using this model in `transformers` (tested on 4.11.0.dev0)
@@ -103,22 +103,14 @@ print(summary)
 
 If you use this model, please cite the following paper:
 ```
-@inproceedings{hasan-etal-2021-xl,
-    title = "{XL}-Sum: Large-Scale Multilingual Abstractive Summarization for 44 Languages",
-    author = "Hasan, Tahmid and
-      Bhattacharjee, Abhik and
-      Islam, Md. Saiful and
-      Mubasshir, Kazi and
-      Li, Yuan-Fang and
-      Kang, Yong-Bin and
-      Rahman, M. Sohel and
-      Shahriyar, Rifat",
-    booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
-    month = aug,
-    year = "2021",
-    address = "Online",
-    publisher = "Association for Computational Linguistics",
-    url = "https://aclanthology.org/2021.findings-acl.413",
-    pages = "4693--4703",
+@article{hasan2021crosssum,
+    author = {Tahmid Hasan and Abhik Bhattacharjee and Wasi Uddin Ahmad and Yuan-Fang Li and Yong-bin Kang and Rifat Shahriyar},
+    title = {CrossSum: Beyond English-Centric Cross-Lingual Abstractive Text Summarization for 1500+ Language Pairs},
+    journal = {CoRR},
+    volume = {abs/2112.08804},
+    year = {2021},
+    url = {https://arxiv.org/abs/2112.08804},
+    eprinttype = {arXiv},
+    eprint = {2112.08804}
 }
 ```
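The second hunk's context line (`print(summary)`) comes from the README's `transformers` usage section, which this diff does not show. A minimal sketch of that usage pattern, following the conventions of the XL-Sum/CrossSum model cards — the checkpoint id `csebuetnlp/mT5_m2o_english_crossSum` and the generation settings below are assumptions, not taken from this diff:

```python
import re

# Collapse newlines and runs of whitespace before tokenization,
# as the XL-Sum/CrossSum model cards do with a WHITESPACE_HANDLER.
WHITESPACE_HANDLER = lambda text: re.sub(r"\s+", " ", re.sub(r"\n+", " ", text.strip()))

def summarize(article_text, model_name="csebuetnlp/mT5_m2o_english_crossSum"):
    """Summarize an article (in any language) into English.

    The checkpoint id and generation parameters are illustrative assumptions.
    """
    # Heavy imports kept inside the function so the text cleanup
    # above can be used without pulling in transformers.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    input_ids = tokenizer(
        [WHITESPACE_HANDLER(article_text)],
        return_tensors="pt",
        padding="max_length",
        truncation=True,
        max_length=512,
    )["input_ids"]

    output_ids = model.generate(
        input_ids=input_ids,
        max_length=84,
        no_repeat_ngram_size=2,
        num_beams=4,
    )[0]

    return tokenizer.decode(
        output_ids,
        skip_special_tokens=True,
        clean_up_tokenization_spaces=False,
    )
```

After the call, `print(summarize(article_text))` mirrors the `print(summary)` line the second hunk anchors on.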