roberta-base-latin-ud-goeswith

Model Description

This is a RoBERTa model pre-trained on CC-100 Latin texts for POS-tagging and dependency parsing (using the UD goeswith relation to attach subword tokens to their word), derived from roberta-base-latin-v2.

How to Use

from transformers import pipeline

# Load the custom "universal-dependencies" pipeline shipped in the model
# repository; trust_remote_code is required because the pipeline class is
# defined there, not in the transformers library itself.
nlp = pipeline("universal-dependencies", "KoichiYasuoka/roberta-base-latin-ud-goeswith", trust_remote_code=True, aggregation_strategy="simple")

# Parse a Latin sentence; the result is a dependency parse in CoNLL-U format.
print(nlp("deus videt te non sentientem"))
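If you prefer not to run remote code, the underlying network can also be queried as a standard token-classification model. The following is a minimal sketch under that assumption; it prints the model's raw per-token labels, not the reconstructed dependency tree that the custom pipeline above produces.

import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Load the tokenizer and the token-classification head directly.
tokenizer = AutoTokenizer.from_pretrained("KoichiYasuoka/roberta-base-latin-ud-goeswith")
model = AutoModelForTokenClassification.from_pretrained("KoichiYasuoka/roberta-base-latin-ud-goeswith")

inputs = tokenizer("deus videt te non sentientem", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring label id of each subword token to its label string.
labels = [model.config.id2label[i] for i in logits.argmax(-1)[0].tolist()]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
print(list(zip(tokens, labels)))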