---
language: el
license: cc-by-sa-4.0
datasets:
  - cc100
  - oscar
  - wikipedia
widget:
  - text: Έχει πολύ καιρό που δεν έχουμε <mask>.
  - text: Ευχαριστώ για το <mask> σου.
  - text: Αυτό είναι <mask>.
  - text: Άνοιξα <mask>.
  - text: Ευχαριστώ για <mask>.
  - text: Έχει πολύ καιρό που δεν <mask>.
---

# RoBERTa Greek small model (Uncased)

## Prerequisites

```
transformers==4.19.2
```
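Assuming `pip` as the package manager, the pinned version above can be installed with:

```shell
# Install the transformers release this card was tested against
pip install "transformers==4.19.2"
```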

## Model architecture

This model has approximately half as many parameters as the RoBERTa base model.

## Tokenizer

The model uses a BPE tokenizer with a vocabulary size of 50,000.
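As a quick sanity check, the tokenizer can be loaded from the Hub and applied to one of the widget sentences. This is a minimal sketch; it assumes the `ClassCat/roberta-small-greek` checkpoint is reachable from your environment:

```python
from transformers import AutoTokenizer

# Load the model's BPE tokenizer from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("ClassCat/roberta-small-greek")

# Split a Greek sentence into subword pieces
tokens = tokenizer.tokenize("Ευχαριστώ για το δώρο σου.")
print(tokens)

# Encode the same sentence to token ids (including special tokens)
ids = tokenizer.encode("Ευχαριστώ για το δώρο σου.")
print(ids)
```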

## Training Data

The model was trained on the Greek portions of CC-100, OSCAR, and Wikipedia.

## Usage

```python
from transformers import pipeline

unmasker = pipeline('fill-mask', model='ClassCat/roberta-small-greek')
unmasker("Έχει πολύ καιρό που δεν <mask>.")
```
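If you need the raw logits rather than the pipeline's post-processed output, the model can also be loaded directly. This is a sketch assuming the same checkpoint; the top-5 decoding shown here is illustrative and not part of the card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("ClassCat/roberta-small-greek")
model = AutoModelForMaskedLM.from_pretrained("ClassCat/roberta-small-greek")

text = "Ευχαριστώ για το <mask> σου."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the <mask> position and take the 5 highest-scoring token ids there
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top5 = logits[0, mask_pos].topk(5).indices[0]

candidates = [tokenizer.decode(idx).strip() for idx in top5]
print(candidates)
```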