---
language:
- gn
- multilingual
license: mit
datasets:
- wikipedia
- wiktionary
widget:
- text: 'Paraguay ha''e peteĩ táva oĩva [MASK] retãme '
- text: Augusto Roa Bastos ha'e peteĩ [MASK] arandu
metrics:
- f1
- accuracy
---
# mBERT+gn-base-cased (multilingual-BERT+gn-base-cased)
BERT multilingual base model (cased), pre-trained on 104 languages and fine-tuned for Guarani (gn) language modeling. Trained on Wikipedia and Wiktionary data (~800K tokens).
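The model can be used for masked-token prediction with the `transformers` fill-mask pipeline. Below is a minimal sketch; the repository id shown is a placeholder assumption, so replace it with this model's actual path on the Hub.

```python
from transformers import pipeline

# NOTE: "mmaguero/mbert-gn-base-cased" is a placeholder model id;
# substitute the actual Hub path of this repository.
fill_mask = pipeline("fill-mask", model="mmaguero/mbert-gn-base-cased")

# One of the widget examples above (Guarani):
# "Paraguay ha'e peteĩ táva oĩva [MASK] retãme"
for prediction in fill_mask("Paraguay ha'e peteĩ táva oĩva [MASK] retãme"):
    # Each prediction carries the filled token and its probability score.
    print(prediction["token_str"], round(prediction["score"], 4))
```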
## How to cite?
@article{aguero-et-al2023multi-affect-low-langs-grn,
  title={Multidimensional Affective Analysis for Low-resource Languages: A Use Case with Guarani-Spanish Code-switching Language},
  author={Agüero-Torales, Marvin Matías and López-Herrera, Antonio Gabriel and Vilares, David},
  journal={Cognitive Computation},
  year={2023},
  publisher={Springer},
  note={Forthcoming}
}