
---
license: apache-2.0
language:
  - ti
  - am
---

# Model Card

EXLMR has been designed with specific support for underrepresented languages, particularly those spoken in Ethiopia, such as Amharic, Tigrinya, and Afaan Oromo. Like XLM-RoBERTa, EXLMR can handle multiple languages simultaneously, making it effective for cross-lingual tasks such as machine translation, multilingual text classification, and question answering. EXLMR-base follows the same architecture as RoBERTa-base, with 12 layers, 768 hidden dimensions, and 12 attention heads, totaling approximately 270M parameters.
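
A minimal usage sketch with the Hugging Face `transformers` library is shown below. The repository id `Hailay/EXLMR` is an assumption inferred from the page context and may need to be adjusted to the actual Hub path.

```python
# Minimal masked-language-modeling sketch for EXLMR.
# NOTE: the repository id below is an assumption; replace it with the actual Hub path.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "Hailay/EXLMR"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Encode a short Tigrinya/Amharic sentence and run a forward pass.
text = "ሰላም ዓለም"  # "Hello world"
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)
```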

| Model       | Vocabulary Size |
|-------------|-----------------|
| XLM-RoBERTa | 250002          |
| EXLMR       | 280147          |
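
The extended vocabulary can be checked directly by comparing tokenizer sizes. The sketch below assumes the Hub ids `xlm-roberta-base` (the public base model) and `Hailay/EXLMR` (an assumed id for this repository).

```python
# Compare the vocabulary sizes of the XLM-RoBERTa and EXLMR tokenizers.
from transformers import AutoTokenizer

xlmr_tok = AutoTokenizer.from_pretrained("xlm-roberta-base")
exlmr_tok = AutoTokenizer.from_pretrained("Hailay/EXLMR")  # assumed repository id

print(len(xlmr_tok))   # expected ~250002
print(len(exlmr_tok))  # expected ~280147
```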