Fix link to arxiv in README.md (#2)
Commit: eaa4ebed8d699238c983ebb637f7b7619d469898
Co-authored-by: Benaya Trabelsi <[email protected]>
README.md
CHANGED
@@ -5,6 +5,6 @@ NOTE: This model was only trained with sequences of up to 128 tokens.
 
 When using AlephBertGimmel, please reference:
 
-Eylon Guetta, Avi Shmidman, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker and Reut Tsarfaty, "Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All", Nov 2022 [http://arxiv.org/abs/2211.15199
+Eylon Guetta, Avi Shmidman, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker and Reut Tsarfaty, "Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All", Nov 2022 [arXiv:2211.15199](http://arxiv.org/abs/2211.15199)
 