Model Card for Ancient Greek to Polish Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to Polish, maintaining word-level alignment between source and target texts.
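To illustrate what word-level alignment means in an interlinear rendering, the sketch below pads each source word and its gloss to a shared column width so the two rows line up vertically. The Polish glosses here are illustrative placeholders, not output of this model.

```python
greek = ["Ἐν", "ἀρχῇ", "ἦν", "ὁ", "λόγος"]
gloss = ["Na", "początku", "było", "-", "Słowo"]  # illustrative glosses, not model output

def interlinear(src, tgt):
    """Render two aligned rows, padding each column to a shared width."""
    widths = [max(len(s), len(t)) for s, t in zip(src, tgt)]
    row_src = "  ".join(s.ljust(w) for s, w in zip(src, widths))
    row_tgt = "  ".join(t.ljust(w) for t, w in zip(tgt, widths))
    return row_src + "\n" + row_tgt

print(interlinear(greek, gloss))
```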

Model Details

Model Description

  • Developed by: Maciej Rapacz, AGH University of Kraków
  • Model Type: Neural machine translation (T5-based)
  • Base Model: mT5-large
  • Tokenizer: mT5
  • Language(s): Ancient Greek (source) → Polish (target)
  • License: CC BY-NC-SA 4.0
  • Tag Set: OB (Oblubienica)
  • Text Preprocessing: diacritics preserved (accented text)
  • Morphological Encoding: emb-sum (morphological tag embeddings summed with token embeddings)

Model Performance

  • BLEU Score: 58.90
  • SemScore: 0.93

Model Sources


  • Hugging Face model ID: mrapacz/interlinear-pl-mt5-large-emb-sum-diacritics-ob
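A minimal inference sketch, assuming the model can be loaded through the standard Hugging Face seq2seq API. Note that the emb-sum morphological-encoding variant may require a repository-specific model class and morphological-tag preprocessing that are not shown here; treat this as a starting point, not the definitive usage.

```python
from transformers import AutoTokenizer, MT5ForConditionalGeneration

# Model ID taken from this card; loading with the stock MT5 class is an
# assumption -- the emb-sum variant may need the repository's custom class.
model_id = "mrapacz/interlinear-pl-mt5-large-emb-sum-diacritics-ob"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = MT5ForConditionalGeneration.from_pretrained(model_id)

# Ancient Greek input with diacritics retained, matching the card's
# "diacritics" preprocessing setting.
text = "Ἐν ἀρχῇ ἦν ὁ λόγος"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```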