Update README.md (#1)
- Update README.md (a81c9cb949a613092e1ec15ac0a1464a36367dc2)
Co-authored-by: Zhuoyuan Mao <[email protected]>
README.md CHANGED
@@ -212,12 +212,18 @@ Details about data, training, evaluation and performance metrics are available i
 ### BibTeX entry and citation info
 
 ```bibtex
-@
-
-
-
-
-
-
+@inproceedings{mao-nakagawa-2023-lealla,
+    title = "{LEALLA}: Learning Lightweight Language-agnostic Sentence Embeddings with Knowledge Distillation",
+    author = "Mao, Zhuoyuan and
+      Nakagawa, Tetsuji",
+    booktitle = "Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics",
+    month = may,
+    year = "2023",
+    address = "Dubrovnik, Croatia",
+    publisher = "Association for Computational Linguistics",
+    url = "https://aclanthology.org/2023.eacl-main.138",
+    doi = "10.18653/v1/2023.eacl-main.138",
+    pages = "1886--1894",
+    abstract = "Large-scale language-agnostic sentence embedding models such as LaBSE (Feng et al., 2022) obtain state-of-the-art performance for parallel sentence alignment. However, these large-scale models can suffer from inference speed and computation overhead. This study systematically explores learning language-agnostic sentence embeddings with lightweight models. We demonstrate that a thin-deep encoder can construct robust low-dimensional sentence embeddings for 109 languages. With our proposed distillation methods, we achieve further improvements by incorporating knowledge from a teacher model. Empirical results on Tatoeba, United Nations, and BUCC show the effectiveness of our lightweight models. We release our lightweight language-agnostic sentence embedding models LEALLA on TensorFlow Hub.",
+}
 ```
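The abstract added in this commit notes that the LEALLA models are released on TensorFlow Hub. As a rough usage sketch, not part of this commit: the hub handle `https://tfhub.dev/google/LEALLA/LEALLA-base/1` and the raw-string input signature are assumptions, based on how similar sentence encoders are published on TF Hub.

```python
import tensorflow as tf
import tensorflow_hub as hub

# Assumed TF Hub handle for the base LEALLA model; check tfhub.dev for the
# exact published path and version.
encoder = hub.KerasLayer("https://tfhub.dev/google/LEALLA/LEALLA-base/1")

# Assumption: like LaBSE-style TF Hub encoders, the model accepts raw string
# tensors and returns one fixed-size embedding per sentence.
sentences = tf.constant(["Hello, world.", "Hola, mundo."])
embeddings = encoder(sentences)
print(embeddings.shape)  # (2, embedding_dim)
```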