---
language:
- en
- mt
tags:
- translation
license: cc-by-4.0
---
### Translation model for en-mt HPLT v1.0
This repository contains the model weights for translation models trained with Marian for the HPLT project. For usage instructions, evaluation scripts, and inference scripts, please refer to the [HPLT-MT-Models v1.0](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0) GitHub repository.
* Source language: en
* Target language: mt
* Dataset: HPLT only
* Model: transformer-base
* Tokenizer: SentencePiece (Unigram)
* Cleaning: We use OpusCleaner to clean the corpus. Details about the rules used can be found in the filter files on [GitHub](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/en-mt/raw/v0).
To run inference with Marian, refer to the [Inference/Decoding/Translation](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0#inferencedecodingtranslation) section of our GitHub repository.
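As a quick local illustration (not a replacement for the repository's inference scripts), the minimal sketch below calls a locally installed `marian-decoder` binary built with SentencePiece support, so the `.spm` vocabulary handles tokenization and detokenization. The file names (`model.npz.best-chrf.npz`, `model.en-mt.spm`) and the decoding settings are assumptions for illustration; use the files and configuration distributed with this model.

```python
# Minimal sketch: translate raw English text to Maltese with a local
# marian-decoder binary. Assumes Marian was built with SentencePiece support;
# file names below are illustrative placeholders, not shipped values.
import subprocess

MODEL = "model.npz.best-chrf.npz"   # assumed checkpoint file name
SPM = "model.en-mt.spm"             # assumed shared SentencePiece vocabulary

source_text = "The weather is nice today.\nWhere is the nearest station?"

result = subprocess.run(
    ["marian-decoder",
     "--models", MODEL,
     "--vocabs", SPM, SPM,          # same SentencePiece model for source and target
     "--beam-size", "6",
     "--normalize", "0.6"],
    input=source_text,
    capture_output=True,
    text=True,
    check=True,
)

print(result.stdout)  # one Maltese translation per input line
```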
## Benchmarks

| testset   | BLEU | chrF | COMET  |
| --------- | ---- | ---- | ------ |
| flores200 | 30.6 | 60.7 | 0.7844 |
| ntrex     | 24.1 | 54.3 | 0.7613 |
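The project's evaluation scripts are in the GitHub repository linked above. Purely as an illustration of how BLEU and chrF scores of this kind are computed, the sketch below uses the `sacrebleu` Python package on placeholder hypothesis and reference files; the file names are assumptions, not artifacts of this repository.

```python
# Minimal sketch: score detokenized system output against references with
# sacrebleu (pip install sacrebleu). This is NOT the project's evaluation
# pipeline; file names are placeholders for illustration.
import sacrebleu

with open("hypotheses.mt", encoding="utf-8") as f:
    hyps = [line.strip() for line in f]
with open("references.mt", encoding="utf-8") as f:
    refs = [line.strip() for line in f]

bleu = sacrebleu.corpus_bleu(hyps, [refs])   # corpus-level BLEU
chrf = sacrebleu.corpus_chrf(hyps, [refs])   # character n-gram F-score

print(f"BLEU: {bleu.score:.1f}")
print(f"chrF: {chrf.score:.1f}")
```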