dannoncaffeine committed on
Commit 41bba85 · 1 Parent(s): 0e44c20

docs: update README.md

Files changed (1)
README.md +13 -4
README.md CHANGED
@@ -6,20 +6,29 @@ tags:
 model-index:
 - name: GPT2-124M-wikitext-v0.1
   results: []
+datasets:
+- wikitext
+pipeline_tag: text-generation
+co2_eq_emissions:
+  emissions: 500
+  training_type: "fine-tuning"
+  source: "mlco2"
+  geographical_location: "Bucharest, Romania"
+  hardware_used: "1 x RTX 4090 GPU"
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# GPT2-124M-wikitext-v0.1
+# 🧠 GPT2-124M-wikitext-v0.1
 
-This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on the None dataset.
+This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on the [wikitext](https://huggingface.co/datasets/wikitext) dataset.
 It achieves the following results on the evaluation set:
 - Loss: 2.9841
 
 ## Model description
 
-More information needed
+This is a practical, hands-on exercise for better understanding 🤗 Transformers and 🤗 Datasets. The model is GPT-2 (124M) fine-tuned on wikitext (103-raw-v1) on 1 x RTX 4090.
 
 ## Intended uses & limitations
 
@@ -56,4 +65,4 @@ The following hyperparameters were used during training:
 - Transformers 4.35.2
 - Pytorch 2.1.0+cu118
 - Datasets 2.15.0
-- Tokenizers 0.15.0
+- Tokenizers 0.15.0
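
Since the commit adds `pipeline_tag: text-generation`, a minimal usage sketch may help readers try the checkpoint. It assumes the model is published under the repo id `dannoncaffeine/GPT2-124M-wikitext-v0.1` (inferred from the author and model name on this page; the exact id is an assumption, not confirmed by the diff):

```python
# Minimal text-generation sketch for this checkpoint.
# Assumption: the repo id is dannoncaffeine/GPT2-124M-wikitext-v0.1
# (author + model name above); adjust if the actual id differs.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="dannoncaffeine/GPT2-124M-wikitext-v0.1",
)

out = generator(
    "The history of natural language processing",
    max_new_tokens=50,
    do_sample=True,
    top_p=0.95,
)
print(out[0]["generated_text"])
```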
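For context on the reported eval loss: assuming it is the mean token-level cross-entropy in nats (the Trainer's usual metric for causal language modeling), it corresponds to a perplexity of exp(2.9841) ≈ 19.8. A one-liner reproduces the arithmetic:

```python
import math

eval_loss = 2.9841  # from the card's evaluation results
# Valid only if the loss is mean cross-entropy in nats (an assumption).
perplexity = math.exp(eval_loss)
print(f"perplexity ≈ {perplexity:.1f}")  # ≈ 19.8
```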