modernbert-large-japanese-wikipedia-ud-square
Model Description
This is a ModernBERT model for POS-tagging and dependency parsing (using goeswith for subwords), derived from modernbert-large-japanese-wikipedia-upos and UD_Japanese-GSDLUW.
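In UD, goeswith attaches every non-initial piece of a split word to its first piece, so subword tokens can be merged back into surface words after parsing. The snippet below is a minimal, hypothetical sketch of such a merge over CoNLL-U lines; the function name and the assumption that the parser emits CoNLL-U text are illustrative and not part of this model's API.

# Minimal sketch (not part of this model's API): merge tokens joined by
# the UD "goeswith" relation back into surface words, given CoNLL-U text.
def merge_goeswith(conllu):
    forms = {}   # token id -> accumulated surface form
    order = []   # ids of word-initial tokens, in sentence order
    for line in conllu.splitlines():
        if not line or line.startswith("#"):
            continue
        cols = line.split("\t")
        if "-" in cols[0] or "." in cols[0]:
            continue  # skip multiword-token and empty-node lines
        tid, form, head, deprel = cols[0], cols[1], cols[6], cols[7]
        if deprel == "goeswith":
            forms[head] += form   # goeswith always points left, so the head is already seen
        else:
            forms[tid] = form
            order.append(tid)
    return [forms[i] for i in order]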
How to Use
from transformers import pipeline

# "universal-dependencies" is a custom task; its pipeline code is loaded
# from the model repository via trust_remote_code=True
nlp = pipeline("universal-dependencies", "KoichiYasuoka/modernbert-large-japanese-wikipedia-ud-square", trust_remote_code=True, aggregation_strategy="simple")
print(nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている"))
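If the pipeline output is CoNLL-U text, as in the author's related *-ud-* models (an assumption here, not something this card states), it can also be rendered as a dependency tree with deplacy:

import deplacy
from transformers import pipeline

nlp = pipeline("universal-dependencies", "KoichiYasuoka/modernbert-large-japanese-wikipedia-ud-square", trust_remote_code=True, aggregation_strategy="simple")
doc = nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている")  # assumed to be a CoNLL-U string
deplacy.render(doc)  # prints an ASCII-art dependency tree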
Reference
安岡孝一: 青空文庫ModernBERTモデルによる国語研長単位係り受け解析 [Koichi Yasuoka: NINJAL Long-Unit-Word Dependency Parsing with Aozora Bunko ModernBERT Models], IPSJ SIG Technical Report, Vol.2025-CH-137 (Computers and the Humanities), No.10 (February 8, 2025), pp.1-7 (in Japanese).