Model Card for Ancient Greek to English Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to English, maintaining word-level alignment between source and target texts.
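
The example below is a minimal usage sketch. It assumes the checkpoint loads through the standard Hugging Face seq2seq classes; the emb-auto morphological encoding may require dedicated loading code released alongside the model, and the sample input and output are illustrative.

```python
# Minimal sketch, assuming standard transformers seq2seq loading works for this checkpoint.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "mrapacz/interlinear-en-greta-emb-auto-diacritics-bh"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Ancient Greek input with diacritics retained (John 1:1, opening words).
text = "Ἐν ἀρχῇ ἦν ὁ Λόγος"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
# Expected interlinear-style gloss, roughly: "In beginning was the Word"
```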

Model Details

Model Description

  • Developed By: Maciej Rapacz, AGH University of Kraków
  • Model Type: Neural machine translation (T5-based)
  • Base Model: GreTa
  • Tokenizer: GreTa
  • Language(s): Ancient Greek (source) → English (target)
  • License: CC BY-NC-SA 4.0
  • Tag Set: BH (Bible Hub)
  • Text Preprocessing: Diacritics (diacritical marks retained in the source text)
  • Morphological Encoding: emb-auto (see the sketch below)
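
The emb-auto setting suggests that morphological information is injected at the embedding level, with tag embeddings learned automatically and combined with the token embeddings before the encoder. The sketch below illustrates that general idea only; the tag vocabulary size, alignment scheme, and combination rule are assumptions rather than the model's actual implementation, and t5-small stands in for the GreTa base.

```python
# Illustrative only: summing learned morphological-tag embeddings with token
# embeddings and feeding the result to a T5 encoder via inputs_embeds.
import torch
import torch.nn as nn
from transformers import T5ForConditionalGeneration, T5TokenizerFast

model = T5ForConditionalGeneration.from_pretrained("t5-small")  # placeholder base model
tokenizer = T5TokenizerFast.from_pretrained("t5-small")

NUM_MORPH_TAGS = 512  # hypothetical size of the BH tag vocabulary
tag_embedding = nn.Embedding(NUM_MORPH_TAGS, model.config.d_model)

enc = tokenizer("Ἐν ἀρχῇ ἦν", return_tensors="pt")
tag_ids = torch.zeros_like(enc.input_ids)  # hypothetical per-token tag ids

token_embeds = model.get_input_embeddings()(enc.input_ids)
combined = token_embeds + tag_embedding(tag_ids)  # token + tag embedding sum

out = model(inputs_embeds=combined,
            attention_mask=enc.attention_mask,
            labels=enc.input_ids)  # dummy labels just to run a forward pass
print(out.loss.item())
```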

Model Performance

  • BLEU Score: 54.18
  • SemScore: 0.86 (see the evaluation sketch below)
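
The scores above are corpus-level metrics. Below is a hedged sketch of how comparable numbers could be computed; sacrebleu and sentence-transformers are assumptions here, not necessarily the evaluation stack used for the reported results, and the example sentences are illustrative.

```python
# Hedged sketch: corpus-level BLEU via sacrebleu and a SemScore-style
# embedding similarity via sentence-transformers.
import sacrebleu
from sentence_transformers import SentenceTransformer, util

hypotheses = ["In beginning was the Word"]       # model outputs (illustrative)
references = ["In the beginning was the Word"]   # gold interlinear glosses (illustrative)

# Corpus-level BLEU over all hypothesis/reference pairs.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU: {bleu.score:.2f}")

# SemScore-style score: mean cosine similarity between sentence embeddings.
# all-mpnet-base-v2 is the encoder used in the original SemScore paper;
# its use here is an assumption.
encoder = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
hyp_emb = encoder.encode(hypotheses, convert_to_tensor=True)
ref_emb = encoder.encode(references, convert_to_tensor=True)
print(f"SemScore: {util.cos_sim(hyp_emb, ref_emb).diagonal().mean().item():.2f}")
```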
