hajekad committed on
Commit d43f0cd · 1 Parent(s): ac29a50

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -7,7 +7,7 @@ datasets:
  ---
 
  # CzeGPT-2
- CzeGPT-2 is a Czech version of GPT-2 language model by OpenAI with LM Head on top. The model has the same architectural dimensions as the GPT-2 small (12 layers, 12 heads, 1024 tokens on input/output, and embedding vectors with 768 dimensions) resulting in 124 M trainable parameters. The model was trained on 5 GB slice of cleaned csTenTen17 dataset.
+ CzeGPT-2 is a Czech version of GPT-2 language model by OpenAI with LM Head on top. The model has the same architectural dimensions as the GPT-2 small (12 layers, 12 heads, 1024 tokens on input/output, and embedding vectors with 768 dimensions) resulting in 124 M trainable parameters. It was trained on a 5 GB slice of cleaned csTenTen17 dataset.
 
  The model is a good building block for any down-stream task requiring autoregressive text generation.
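The "124 M trainable parameters" figure in the edited paragraph can be checked against the stated dimensions. The sketch below tallies a GPT-2-small-shaped model (12 layers, 768-dim embeddings, 12 heads, 1024-token context, tied LM head); the vocabulary size of 50,257 is an assumption carried over from OpenAI's GPT-2 tokenizer, since the model card does not state the Czech tokenizer's vocabulary size.

```python
# Parameter count for a GPT-2-small-shaped model with a tied LM head.
# Assumption: GPT-2's original vocab size (50,257); CzeGPT-2's Czech
# tokenizer may differ, which would shift only the embedding term.
n_layer, d_model, n_ctx, vocab = 12, 768, 1024, 50257
d_ff = 4 * d_model  # GPT-2 uses a 4x feed-forward expansion

embeddings = vocab * d_model + n_ctx * d_model  # token + position tables

per_block = (
    2 * d_model                            # layer norm 1 (gain + bias)
    + d_model * 3 * d_model + 3 * d_model  # fused Q/K/V projection
    + d_model * d_model + d_model          # attention output projection
    + 2 * d_model                          # layer norm 2
    + d_model * d_ff + d_ff                # MLP up-projection
    + d_ff * d_model + d_model             # MLP down-projection
)

total = embeddings + n_layer * per_block + 2 * d_model  # + final layer norm
print(f"{total:,}")  # 124,439,808 -- i.e. the ~124 M quoted in the card
```

With a tied LM head the output projection reuses the token embedding matrix, so it adds no parameters; the sum lands on 124,439,808, matching the ~124 M claim.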