atlijas committed on
Commit d7b83b2 · verified · 1 Parent(s): 9e1ac48

Update README.md

Files changed (1)
  1. README.md +51 -10
README.md CHANGED
@@ -1,10 +1,51 @@
- ---
- license: apache-2.0
- language:
- - en
- - is
- library_name: fairseq
- tags:
- - translation
- - wmt
- ---
+ ---
+ license: apache-2.0
+ language:
+ - en
+ - is
+ library_name: fairseq
+ tags:
+ - translation
+ - wmt
+ ---
+
+ ## Model description
+ This is a translation model that translates text from English to Icelandic. It follows the transformer architecture described in [Attention Is All You Need](https://arxiv.org/pdf/1706.03762) and was trained with [fairseq](https://github.com/facebookresearch/fairseq) for [WMT24](https://www2.statmt.org/wmt24/).
+
+ This is the base version of our model. See also: [base_deep](link), [big](link), [big_deep](link).
+
+ | Model     | d_model | d_ff | h  | N_enc | N_dec |
+ |:----------|:--------|:-----|:---|:------|:------|
+ | Base      | 512     | 2048 | 8  | 6     | 6     |
+ | Base_deep | 512     | 2048 | 8  | 36    | 12    |
+ | Big       | 1024    | 4096 | 16 | 6     | 6     |
+ | Big_deep  | 1024    | 4096 | 16 | 36    | 12    |
+
+
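+ As a rough orientation (not taken from the model card itself), the table columns correspond to the standard fairseq transformer hyperparameters. The sketch below spells out that mapping for the Base row as a plain Python dict; the key names follow the usual `fairseq-train` options and are given here only for illustration.
+
+ ```python
+ # Hypothetical mapping of the "Base" row above onto fairseq transformer
+ # hyperparameters (names as used by fairseq-train); values are taken from the table.
+ base_config = {
+     "arch": "transformer",
+     "encoder_embed_dim": 512,       # d_model
+     "encoder_ffn_embed_dim": 2048,  # d_ff
+     "encoder_attention_heads": 8,   # h
+     "encoder_layers": 6,            # N_enc
+     "decoder_embed_dim": 512,       # d_model (decoder side)
+     "decoder_ffn_embed_dim": 2048,  # d_ff (decoder side)
+     "decoder_attention_heads": 8,   # h (decoder side)
+     "decoder_layers": 6,            # N_dec
+ }
+ ```
+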
+ #### How to use
+
+ ```python
+ from fairseq.models.transformer import TransformerModel
+
+ # 'path/to/model_dir' is the directory containing the checkpoint and the SentencePiece model.
+ TRANSLATION_MODEL = TransformerModel.from_pretrained(
+     'path/to/model_dir',
+     checkpoint_file='model.pt',
+     bpe='sentencepiece',
+     sentencepiece_model='path/to/model_dir/sentencepiece.bpe.model',
+ )
+ src_sentences = ['This is a test sentence.', 'This is another test sentence.']
+ # translate() applies the BPE, runs beam search and detokenizes the output.
+ translated_sentences = TRANSLATION_MODEL.translate(src_sentences, beam=5)
+ print(translated_sentences)
+ ```
+
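+ The loaded model can also translate a file in batches. The snippet below is a minimal sketch reusing `TRANSLATION_MODEL` from the block above; the file names and batch size are only examples, not part of this repository.
+
+ ```python
+ # Hypothetical example: translate 'input.en' line by line in batches of 32
+ # and write the Icelandic output to 'output.is'.
+ with open('input.en', encoding='utf-8') as src, open('output.is', 'w', encoding='utf-8') as out:
+     lines = [line.strip() for line in src if line.strip()]
+     for i in range(0, len(lines), 32):
+         for translation in TRANSLATION_MODEL.translate(lines[i:i + 32], beam=5):
+             out.write(translation + '\n')
+ ```
+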
+ #### Limitations and bias
+
+ ## Training data
+
+ ## Eval results
+
+ ### BibTeX entry and citation info
+
+ ```bibtex
+ @inproceedings{...,
+   year={XXX},
+   title={XXX},
+   author={XXX},
+   booktitle={XXX},
+ }
+ ```