---
language: uz
tags:
- uzbert
- uzbek
- bert
- cyrillic
license: mit
datasets:
- webcrawl
---
# UzBERT base model (uncased)
Pretrained model on the Uzbek language (Cyrillic script) using masked
language modeling and next sentence prediction objectives.
## How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='coppercitylabs/uzbert-base-uncased')
>>> unmasker("Алишер Навоий – улуғ ўзбек ва бошқа туркий халқларнинг [MASK], мутафаккири ва давлат арбоби бўлган.")
[
{
'token_str': 'шоири',
'token': 13587,
'score': 0.7974384427070618,
'sequence': 'алишер навоий – улуғ ўзбек ва бошқа туркий халқларнинг шоири, мутафаккир ##и ва давлат арбоби бўлган.'
},
{
'token_str': 'олими',
'token': 18500,
'score': 0.09166576713323593,
'sequence': 'алишер навоий – улуғ ўзбек ва бошқа туркий халқларнинг олими, мутафаккир ##и ва давлат арбоби бўлган.'
},
{
'token_str': 'асосчиси',
'token': 7469,
'score': 0.02451123297214508,
'sequence': 'алишер навоий – улуғ ўзбек ва бошқа туркий халқларнинг асосчиси, мутафаккир ##и ва давлат арбоби бўлган.'
},
{
'token_str': 'ёзувчиси',
'token': 22439,
'score': 0.017601722851395607,
'sequence': 'алишер навоий – улуғ ўзбек ва бошқа туркий халқларнинг ёзувчиси, мутафаккир ##и ва давлат арбоби бўлган.'
},
{
'token_str': 'устози',
'token': 11494,
'score': 0.010115668177604675,
'sequence': 'алишер навоий – улуғ ўзбек ва бошқа туркий халқларнинг устози, мутафаккир ##и ва давлат арбоби бўлган.'
}
]
```
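You can also use the model directly to obtain contextual token features. The sketch below is a minimal example, assuming PyTorch and the `transformers` library are installed and the model can be downloaded from the Hugging Face Hub; the Uzbek sentence is an arbitrary illustration:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load the tokenizer and the base encoder (without the MLM head).
tokenizer = AutoTokenizer.from_pretrained('coppercitylabs/uzbert-base-uncased')
model = AutoModel.from_pretrained('coppercitylabs/uzbert-base-uncased')

text = "Алишер Навоий улуғ ўзбек шоири бўлган."
inputs = tokenizer(text, return_tensors='pt')

with torch.no_grad():
    outputs = model(**inputs)

# `last_hidden_state` holds one vector per input token; a BERT-base
# encoder produces 768-dimensional hidden states.
print(outputs.last_hidden_state.shape)
```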
## Training data
The UzBERT model was pretrained on ~625K news articles (~142M words).
## BibTeX entry and citation info
```bibtex
@misc{mansurov2021uzbert,
title={{UzBERT: pretraining a BERT model for Uzbek}},
author={B. Mansurov and A. Mansurov},
year={2021},
eprint={2108.09814},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```