Commit 41e2d13
Parent(s): 054bc72
Update README.md

README.md CHANGED
@@ -1,4 +1,4 @@
-## Persian XLM-RoBERTa Large For
+## Persian XLM-RoBERTa Large For Question Answering Task
 
 XLM-RoBERTA is a multilingual language model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper [Unsupervised Cross-lingual Representation Learning at Scale](https://arxiv.org/abs/1911.02116v2) by Conneau et al.
 