---
language:
- gn
- es
license: mit
datasets:
- wikipedia
- wiktionary
widget:
- text: 'Paraguay ha''e peteĩ táva oĩva [MASK] retãme'
- text: Augusto Roa Bastos ha'e peteĩ [MASK] arandu
metrics:
- f1
- accuracy
---

# BETO+gn-base-cased
BETO-base-cased (a pre-trained Spanish BERT model) fine-tuned for Guarani masked language modeling (Spanish + Guarani). Trained on Wikipedia and Wiktionary (~800K tokens).
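A minimal usage sketch with the Hugging Face `transformers` fill-mask pipeline, using one of the widget examples from this card. The hub identifier `mmaguero/beto-gn-base-cased` is an assumption; substitute the model's actual repository id.

```python
from transformers import pipeline

# NOTE: the model id below is an assumption -- replace with the actual hub id.
fill_mask = pipeline("fill-mask", model="mmaguero/beto-gn-base-cased")

# Predict the masked token in a Guarani sentence from this card's widget.
for pred in fill_mask("Paraguay ha'e peteĩ táva oĩva [MASK] retãme"):
    print(f"{pred['token_str']}: {pred['score']:.3f}")
```

Each prediction is a dict with the filled token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).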
## How to cite?
```bibtex
@article{aguero-et-al2023multi-affect-low-langs-grn,
  title={Multidimensional Affective Analysis for Low-resource Languages: A Use Case with Guarani-Spanish Code-switching Language},
  author={Agüero-Torales, Marvin Matías and López-Herrera, Antonio Gabriel and Vilares, David},
  journal={Cognitive Computation},
  year={2023},
  publisher={Springer},
  note={Forthcoming}
}
```