Model Card for Ancient Greek to Polish Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to Polish, maintaining word-level alignment between the source and target texts.
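
A minimal usage sketch, assuming the checkpoint loads through the standard Hugging Face transformers seq2seq classes; the repository id is taken from the model name referenced on this card, and the exact input preprocessing used during training (e.g., word segmentation for interlinear alignment) is not documented here and may differ.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumed repository id; verify the exact id on the Hugging Face Hub before use.
model_id = "mrapacz/interlinear-pl-philta-baseline-diacritics-unused"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Illustrative Ancient Greek input with diacritics retained.
text = "Ἐν ἀρχῇ ἦν ὁ λόγος"

inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```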

Model Details

Model Description

  • Developed By: Maciej Rapacz, AGH University of Kraków
  • Model Type: Neural machine translation (T5-based)
  • Base Model: PhilTa
  • Tokenizer: PhilTa
  • Language(s): Ancient Greek (source) → Polish (target)
  • License: CC BY-NC-SA 4.0
  • Tag Set: Unused (no morphological tag set applied)
  • Text Preprocessing: diacritics retained
  • Morphological Encoding: baseline (text only, no morphological tags)

Model Performance

  • BLEU Score: 0.03
  • SemScore: 0.18
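
The evaluation toolkit and score scale are not stated on this card; as a hedged illustration only, corpus-level BLEU could be computed with sacrebleu (an assumed choice of library) along the following lines.

```python
import sacrebleu

# Hypothetical Polish hypotheses and references; a real evaluation would use the full test split.
hypotheses = ["na początku było słowo"]
references = [["Na początku było Słowo"]]  # one reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(bleu.score)  # sacrebleu reports BLEU on a 0-100 scale
```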
