BERT-i-base-cased (gnBERT-base-cased)

A pre-trained BERT model for Guarani (12 layers, cased), trained on Wikipedia and Wiktionary (~800K tokens).
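
A quick-start sketch using the Hugging Face `transformers` fill-mask pipeline. The repo id `mmaguero/gn-bert-base-cased` is the one shown on this page; the Guarani example sentence is illustrative only, not from the training data.

```python
from transformers import pipeline

# Load the checkpoint from the Hugging Face Hub (repo id taken from this card).
fill_mask = pipeline("fill-mask", model="mmaguero/gn-bert-base-cased")

# Illustrative Guarani sentence with one masked token.
for pred in fill_mask("Che [MASK] porã."):
    print(pred["token_str"], round(pred["score"], 4))
```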

How to cite?

@article{aguero-et-al2023multi-affect-low-langs-grn,
  title={Multidimensional Affective Analysis for Low-resource Languages: A Use Case with Guarani-Spanish Code-switching Language},
  author={Agüero-Torales, Marvin Matías and López-Herrera, Antonio Gabriel and Vilares, David},
  journal={Cognitive Computation},
  year={2023},
  publisher={Springer},
  note={Forthcoming}
}