---
license: cc-by-sa-4.0
language:
  - en
metrics:
  - bleu
base_model:
  - mT5-large
library_name: transformers
datasets:
  - mrapacz/greek-interlinear-translations
---

Model Card for Ancient Greek to English Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to English, maintaining word-level alignment between source and target texts.
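The sketch below shows one way to run inference with the standard transformers seq2seq classes. The repository ID placeholder, the sample verse, and the generation settings are assumptions, and the emb-sum morphological encoding noted under Model Details may require the project's custom model code rather than the stock Auto classes.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder -- replace with this repository's Hub ID.
MODEL_ID = "<this-repository-id>"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

# Ancient Greek input (John 1:1) with diacritics kept, matching the
# text preprocessing listed under Model Details.
text = "Ἐν ἀρχῇ ἦν ὁ Λόγος"

inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```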

Model Details

Model Description

  • Developed By: Maciej Rapacz, AGH University of Kraków
  • Model Type: Neural machine translation (T5-based)
  • Base Model: mT5-large
  • Tokenizer: mT5
  • Language(s): Ancient Greek (source) → English (target)
  • License: CC BY-NC-SA 4.0
  • Tag Set: OB (Oblubienica)
  • Text Preprocessing: Diacritics retained
  • Morphological Encoding: emb-sum
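For reference, the paired interlinear dataset listed in the metadata can be inspected with the datasets library. This is only a sketch: the split and field names are assumptions, so check the dataset card for the actual schema.

```python
from datasets import load_dataset

# Dataset ID is taken from the metadata above; split and field names
# are assumptions -- consult the dataset card for the actual schema.
ds = load_dataset("mrapacz/greek-interlinear-translations")
print(ds)            # shows the available splits and their features

split = next(iter(ds))   # e.g. "train" (assumed)
print(ds[split][0])      # one aligned Greek/English example
```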

Model Performance

  • BLEU Score: 0.83
  • SemScore: 0.34
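The BLEU figure could in principle be reproduced with standard tooling. The snippet below is a minimal sketch using the evaluate library with made-up strings, not the authors' evaluation pipeline, and its metric configuration (tokenization, n-gram order, scale) may differ from the one used to produce the score above.

```python
import evaluate

# Illustrative strings only -- in practice these would be the model's
# outputs and the dataset's English interlinear glosses.
predictions = ["in beginning was the word"]
references = [["in the beginning was the word"]]

bleu = evaluate.load("bleu")
result = bleu.compute(predictions=predictions, references=references)
print(result["bleu"])  # corpus-level BLEU, 0-1 scale in this implementation
```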

Model Sources