KoBERT

How to use

To load the KoBERT tokenizer with AutoTokenizer, you must pass trust_remote_code=True, since the tokenizer is implemented in custom code hosted in the model repository.

from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/kobert")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)
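Once loaded, the model and tokenizer can be used like any other BERT-style encoder. The sketch below is a minimal example, assuming the standard BERT-base hidden size of 768; the example sentence is arbitrary.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# trust_remote_code=True is required because the KoBERT tokenizer
# is implemented in custom code in the model repository.
model = AutoModel.from_pretrained("monologg/kobert")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)

# Tokenize a Korean sentence ("Sharing a Korean language model.")
# and run a forward pass without tracking gradients.
inputs = tokenizer("한국어 모델을 공유합니다.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```

The last hidden state gives per-token contextual embeddings; for sentence-level tasks, a common choice is the hidden state of the first ([CLS]) token.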

Model size: 92.2M parameters (F32, Safetensors)