- A Japanese tokenizer built with a vocab_size of 24576
- Trained on Wikipedia (Japanese only)
- Trained with unidic + sentencepiece (unigram); a sketch of the presumed pipeline follows this list
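
Below is a minimal sketch of how such a tokenizer might be trained: Japanese Wikipedia text is pre-tokenized with MeCab using the unidic dictionary, then a SentencePiece unigram model is trained with the stated vocabulary size. The file names, the `fugashi` wrapper, and the `character_coverage` value are assumptions for illustration, not details taken from the original description.

```python
# Sketch, not the original training script. Assumes fugashi (MeCab wrapper)
# with the unidic dictionary installed, plus the sentencepiece package.
import fugashi
import sentencepiece as spm

tagger = fugashi.Tagger()

# Pre-tokenize each line of the Wikipedia dump into space-separated morphemes.
# "jawiki.txt" / "jawiki_pretok.txt" are placeholder file names.
with open("jawiki.txt", encoding="utf-8") as src, \
        open("jawiki_pretok.txt", "w", encoding="utf-8") as dst:
    for line in src:
        words = [token.surface for token in tagger(line.strip())]
        dst.write(" ".join(words) + "\n")

# Train a SentencePiece unigram model with the stated vocab_size of 24576.
spm.SentencePieceTrainer.train(
    input="jawiki_pretok.txt",
    model_prefix="ja_unigram_24576",
    vocab_size=24576,
    model_type="unigram",
    character_coverage=0.9995,  # common choice for Japanese; an assumption here
)
```

The resulting `ja_unigram_24576.model` could then be loaded with `spm.SentencePieceProcessor(model_file="ja_unigram_24576.model")` for encoding and decoding.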