Model Card for Ancient Greek to English Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to English, maintaining word-level alignment between source and target texts.

Model Details

Model Description

  • Developed by: Maciej Rapacz, AGH University of Kraków
  • Model Type: Neural machine translation (T5-based)
  • Base Model: PhilTa
  • Tokenizer: PhilTa
  • Language(s): Ancient Greek (source) → English (target)
  • License: CC BY-NC-SA 4.0
  • Tag Set: OB (Oblubienica)
  • Text Preprocessing: Normalized
  • Morphological Encoding: emb-auto
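A minimal usage sketch with the `transformers` library, assuming the model loads through the standard seq2seq auto classes and accepts plain normalized Ancient Greek text. The exact input format (in particular, how the `emb-auto` morphological encoding and OB tags are supplied) may differ from this plain-text call; consult the training repository for the canonical preprocessing.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "mrapacz/interlinear-en-philta-emb-auto-normalized-ob"


def translate(text: str, max_new_tokens: int = 64) -> str:
    """Translate normalized Ancient Greek text to English.

    Downloads model weights on first call; input is assumed to be
    normalized text, matching the card's preprocessing setting.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


# Example call (hypothetical input; requires network access to fetch weights):
# print(translate("εν αρχη ην ο λογος"))
```

The sketch wraps loading and generation in one function for clarity; for repeated calls, load the tokenizer and model once and reuse them.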

Model Performance

  • BLEU Score: 56.51
  • SemScore: 0.87
