---
language:
- fr
- en
metrics:
- bleu
pipeline_tag: translation
model-index:
- name: NMT-EN-FR
  results:
  - task:
      type: translation
    dataset:
      name: UN Corpus
      type: bilingual
    metrics:
    - name: BLEU
      type: BLEU
      value: 49
library_name: ctranslate2
license: cc-by-sa-4.0
---
# Model Details
French-to-English machine translation model trained by Yasmin Moslem.
The model is based on the Transformer (base) architecture.
It was originally trained with OpenNMT-py and then converted to the CTranslate2 format for efficient inference.
## Tools
- OpenNMT-py
- CTranslate2
## Data
This model is trained on the French-to-English portion of the [UN Corpus](https://conferences.unite.un.org/UNCorpus/),
consisting of approx. 20 million segments.
## Tokenizer
The tokenizer was trained using [SentencePiece](https://github.com/google/sentencepiece) on a shared vocabulary.
Hence, a single SentencePiece model is used to tokenize both the source and target texts.
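
For illustration, here is a minimal sketch of loading the shared SentencePiece model and tokenizing text in both languages. The file name `sentencepiece.model` is an assumption; use the actual tokenizer file shipped with this repository.

```python
import sentencepiece as spm

# "sentencepiece.model" is a placeholder; replace it with the
# SentencePiece model file distributed alongside this model.
sp = spm.SentencePieceProcessor(model_file="sentencepiece.model")

# The same shared model tokenizes both the French source
# and the English target.
print(sp.encode("Bonjour tout le monde !", out_type=str))
print(sp.encode("Hello everyone!", out_type=str))
```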
## Demo
A demo of this model is available at: https://www.machinetranslation.io/
The demo also illustrates word-level auto-suggestions with teacher forcing.
## Inference
If you want to run this model locally, you can use the [CTranslate2](https://github.com/OpenNMT/CTranslate2) library.
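
The following is a minimal sketch of a full translation pipeline combining CTranslate2 with the SentencePiece tokenizer described above. The paths `ct2_model/` and `sentencepiece.model` are placeholders for the files in this repository, and the decoding settings shown are CTranslate2 defaults, not settings prescribed by the author.

```python
import ctranslate2
import sentencepiece as spm

# Placeholder paths; point these at the downloaded model files.
translator = ctranslate2.Translator("ct2_model/", device="cpu")
sp = spm.SentencePieceProcessor(model_file="sentencepiece.model")

source = "Le comité a adopté la résolution sans vote."
tokens = sp.encode(source, out_type=str)           # subword-tokenize the French input
results = translator.translate_batch([tokens])     # beam-search translation
translation = sp.decode(results[0].hypotheses[0])  # detokenize the best hypothesis
print(translation)
```

`translate_batch` accepts a list of tokenized sentences, so multiple segments can be translated in one call for higher throughput.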
## Citation
```
@inproceedings{moslem-etal-2022-translation,
    title = "Translation Word-Level Auto-Completion: What Can We Achieve Out of the Box?",
    author = "Moslem, Yasmin  and
      Haque, Rejwanul  and
      Way, Andy",
    booktitle = "Proceedings of the Seventh Conference on Machine Translation (WMT)",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, United Arab Emirates",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.wmt-1.119",
    pages = "1176--1181",
}
``` |