---
license: apache-2.0
language:
- lv
---
# Latvian BERT base model (cased)
A model pretrained on Latvian using a masked language modeling (MLM) objective. It was introduced in [this paper](http://ebooks.iospress.nl/volumearticle/55531) and first released via the CLARIN-LV repository.
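As a minimal sketch of the MLM objective in use, the model could be queried through the `fill-mask` pipeline of the `transformers` library. The model identifier `AiLab-IMCS-UL/lvbert` below is an assumption, not confirmed by this card; substitute the actual repository name.

```python
# Minimal fill-mask sketch. The model identifier is an illustrative
# assumption and should be replaced with the actual repository name.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="AiLab-IMCS-UL/lvbert")

# Predict the masked token in a Latvian sentence.
for prediction in fill_mask("Rīga ir Latvijas [MASK] pilsēta."):
    print(prediction["token_str"], round(prediction["score"], 3))
```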
This model is case-sensitive. It is primarily intended to be fine-tuned on downstream natural language understanding (NLU) tasks.
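For fine-tuning, the pretrained encoder can be loaded with a task-specific head in the usual `transformers` way. The sketch below assumes the same hypothetical model identifier and an illustrative three-label classification task.

```python
# Minimal loading sketch for downstream fine-tuning. The model identifier
# and the label count are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "AiLab-IMCS-UL/lvbert"  # hypothetical identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

# Tokenize a Latvian sentence; the case-sensitive tokenizer preserves casing.
inputs = tokenizer("Šis ir piemēra teikums.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (1, num_labels)
```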
Developed at AiLab.lv
## BibTeX entry and citation info
Please cite this paper if you use `lvbert`:
```bibtex
@inproceedings{Znotins-Barzdins:2020:BalticHLT,
  author    = {Arturs Znotins and Guntis Barzdins},
  title     = {{LVBERT: Transformer-Based Model for Latvian Language Understanding}},
  booktitle = {Human Language Technologies - The Baltic Perspective},
  series    = {Frontiers in Artificial Intelligence and Applications},
  volume    = {328},
  publisher = {IOS Press},
  year      = {2020},
  pages     = {111-115},
  doi       = {10.3233/FAIA200610},
  url       = {http://ebooks.iospress.nl/volumearticle/55531}
}
```