DunnBC22 committed on
Commit 516cbce
1 Parent(s): 1f9ec55

Update README.md

Files changed (1): README.md (+7, -11)
README.md CHANGED
@@ -8,11 +8,11 @@ metrics:
 model-index:
 - name: mbart-large-50-English_German_Translation
   results: []
+language:
+- en
+- de
 ---
 
-<!-- This model card has been generated automatically according to the information the Trainer had access to. You
-should probably proofread and complete it, then remove this comment. -->
-
 # mbart-large-50-English_German_Translation
 
 This model is a fine-tuned version of [facebook/mbart-large-50](https://huggingface.co/facebook/mbart-large-50) on the None dataset.
@@ -24,19 +24,15 @@ It achieves the following results on the evaluation set:
 
 ## Model description
 
-Here is the link to the script I created to train this model:
-
-https://github.com/DunnBC22/NLP_Projects/blob/main/Machine%20Translation/NLP%20Translation%20Project-EN:DE.ipynb
+Here is the link to the script I created to train this model: https://github.com/DunnBC22/NLP_Projects/blob/main/Machine%20Translation/NLP%20Translation%20Project-EN:DE.ipynb
 
 ## Intended uses & limitations
 
-More information needed
+This model is intended to demonstrate my ability to solve a complex problem using technology.
 
 ## Training and evaluation data
 
-Here is a the link to the page where I found this dataset:
-
-https://www.kaggle.com/datasets/hgultekin/paralel-translation-corpus-in-22-languages
+Here is the link to the page where I found this dataset: https://www.kaggle.com/datasets/hgultekin/paralel-translation-corpus-in-22-languages
 
 ## Training procedure
 
@@ -63,4 +59,4 @@ The following hyperparameters were used during training:
 - Transformers 4.22.2
 - Pytorch 1.12.1
 - Datasets 2.5.2
-- Tokenizers 0.12.1
+- Tokenizers 0.12.1
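
Since the updated card stops at links, here is a minimal inference sketch showing how the fine-tuned model would be used for English-to-German translation. The Hub id `DunnBC22/mbart-large-50-English_German_Translation` is an assumption inferred from the `model-index` name above, and the snippet follows the standard mBART-50 generation recipe rather than anything stated in the commit:

```python
# Minimal EN->DE inference sketch. ASSUMPTION: the model is published on the Hub
# under the model-index name shown in the card; adjust the id if it differs.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_id = "DunnBC22/mbart-large-50-English_German_Translation"  # assumed Hub id
tokenizer = MBart50TokenizerFast.from_pretrained(model_id, src_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer("Machine translation is fun.", return_tensors="pt")

# mBART-50 expects the target-language code to be forced as the first
# generated token; "de_DE" corresponds to the `de` tag added in the metadata.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["de_DE"],
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

The `en_XX`/`de_DE` codes are mBART-50's internal names for the `en` and `de` language tags this commit adds to the front matter.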
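
The last hunk's context line mentions training hyperparameters, but their values are unchanged by this commit and therefore elided from the diff; the author's actual setup is in the linked notebook. As orientation only, here is a generic `Seq2SeqTrainer` sketch for fine-tuning mBART-50 on parallel EN/DE pairs. Every value below is an illustrative placeholder, not a number taken from this model card:

```python
# Generic mBART-50 EN->DE fine-tuning sketch (transformers >= 4.22).
# All hyperparameters are PLACEHOLDERS, not the values from this model card.
from datasets import Dataset
from transformers import (
    DataCollatorForSeq2Seq,
    MBartForConditionalGeneration,
    MBart50TokenizerFast,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = MBart50TokenizerFast.from_pretrained(
    "facebook/mbart-large-50", src_lang="en_XX", tgt_lang="de_DE"
)
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50")

def preprocess(batch):
    # Assumes parallel "en" (source) and "de" (target) text columns.
    model_inputs = tokenizer(batch["en"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["de"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

# Toy stand-in for the Kaggle parallel corpus named in the card.
train_data = Dataset.from_dict({
    "en": ["Good morning.", "How are you?"],
    "de": ["Guten Morgen.", "Wie geht es dir?"],
}).map(preprocess, batched=True, remove_columns=["en", "de"])

args = Seq2SeqTrainingArguments(
    output_dir="mbart50-en-de",     # placeholder
    learning_rate=2e-5,             # placeholder
    per_device_train_batch_size=8,  # placeholder
    num_train_epochs=1,             # placeholder
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_data,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```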