Tags: Text Generation · Transformers · Safetensors · Czech · mpt · custom_code · text-generation-inference · Inference Endpoints
mfajcik committed on
Commit ce06b0d
1 Parent(s): ccc5b3a

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -6,7 +6,7 @@ language:
  - cs
  ---
  # Introduction
- CSMPT7b is a large Czech language model continuously pretrained for 272b training steps from the English [MPT7b](https://huggingface.co/mosaicml/mpt-7b) model. The model was pretrained on the ~67b-token [Large Czech Collection](https://huggingface.co/datasets/BUT-FIT/but_lcc) using a Czech tokenizer obtained with our vocabulary swap method (see below).
+ CSMPT7b is a large Czech language model continuously pretrained for 272b training tokens from the English [MPT7b](https://huggingface.co/mosaicml/mpt-7b) model. The model was pretrained on the ~67b-token [Large Czech Collection](https://huggingface.co/datasets/BUT-FIT/but_lcc) using a Czech tokenizer obtained with our vocabulary swap method (see below).
  Training was done on the [Karolina](https://www.it4i.cz/en) cluster.

  # Evaluation
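
For context on the `custom_code` / text-generation tags above, here is a minimal sketch of how a continued-pretraining MPT-style model like this one is typically loaded and queried with the `transformers` library. The repo id `BUT-FIT/csmpt7b`, the bf16 dtype, the sampling settings, and the Czech prompt are assumptions for illustration; they are not stated in this commit.

```python
# Minimal sketch, not the authors' official usage example.
# Assumption: the model is published under the hypothetical repo id "BUT-FIT/csmpt7b";
# MPT-based repos with custom modeling code require trust_remote_code=True.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BUT-FIT/csmpt7b"  # hypothetical id, inferred from the dataset namespace

# Load the swapped-in Czech tokenizer and the continuously pretrained model.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype for a 7B-parameter model
    trust_remote_code=True,      # the repo ships custom MPT modeling code
)

prompt = "Nejznámějším českým spisovatelem "  # "The most famous Czech writer "
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```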