---
language: multilingual
tags:
- biomedical
- cross-lingual
datasets:
- UMLS
---

### SapBERT-XLMR
SapBERT [(Liu et al. 2020)](https://arxiv.org/pdf/2010.11784.pdf) trained on [UMLS](https://www.nlm.nih.gov/research/umls/licensedcontent/umlsknowledgesources.html) 2020AB, with [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) as the base model. Please use the [CLS] token embedding as the representation of the input.

### Citation
```bibtex
@article{liu2020self,
  title={Self-alignment Pre-training for Biomedical Entity Representations},
  author={Liu, Fangyu and Shareghi, Ehsan and Meng, Zaiqiao and Basaldella, Marco and Collier, Nigel},
  journal={arXiv preprint arXiv:2010.11784},
  year={2020}
}
```
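
### Usage (sketch)
A minimal sketch of extracting [CLS] embeddings with the `transformers` library, as described above. The model identifier below is an assumption (a placeholder for this checkpoint's actual Hugging Face repo name), and the entity names are illustrative only.

```python
# Sketch: encode entity names and take the [CLS] token as the representation.
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed repo id -- replace with the actual identifier of this checkpoint.
model_name = "cambridgeltl/SapBERT-UMLS-2020AB-all-lang-from-XLMR-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

names = ["covid-19", "coronavirus infection", "high fever"]  # example inputs
with torch.no_grad():
    toks = tokenizer(names, padding=True, truncation=True,
                     max_length=25, return_tensors="pt")
    # [CLS] representation = hidden state of the first token
    cls_rep = model(**toks).last_hidden_state[:, 0, :]

# Cosine similarity between the first name and the rest
sims = torch.nn.functional.cosine_similarity(cls_rep[0:1], cls_rep[1:])
print(sims)
```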