Update README.md
---
license: apache-2.0
---

# Introduction

CSMPT7b is a large Czech language model continuously pretrained for 272b training steps from the English [MPT7b](https://huggingface.co/mosaicml/mpt-7b) model. The model was pretrained on the ~67b-token [Large Czech Collection](https://huggingface.co/datasets/BUT-FIT/but_lcc) using a Czech tokenizer, obtained with our vocabulary swap method (see below).
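
As an illustrative aside (not the exact recipe used for CSMPT7b), a vocabulary swap can be sketched as copying embedding rows for tokens shared between the old and new tokenizers and freshly initializing the rest. The `old_vocab`/`new_vocab` token-to-id dictionaries and the mean/std-based initialization below are assumptions for the sketch:

```python
import numpy as np

def swap_vocabulary(old_emb, old_vocab, new_vocab, seed=0):
    # Illustrative sketch only: reuse embedding rows for tokens present
    # in both vocabularies; draw the remaining rows from the old
    # embeddings' per-dimension mean/std (an assumed init scheme).
    rng = np.random.default_rng(seed)
    dim = old_emb.shape[1]
    new_emb = rng.normal(old_emb.mean(axis=0), old_emb.std(axis=0),
                         size=(len(new_vocab), dim))
    shared = 0
    for tok, new_id in new_vocab.items():
        old_id = old_vocab.get(tok)
        if old_id is not None:
            new_emb[new_id] = old_emb[old_id]  # copy shared-token row
            shared += 1
    return new_emb, shared
```

Tokens kept by both tokenizers start from their already-trained English-model embeddings, which is what lets continued pretraining converge faster than training from scratch.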

Training was done on the [Karolina](https://www.it4i.cz/en) cluster.
# Evaluation

Dev eval on CS-HellaSwag (an automatically translated HellaSwag benchmark).
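
For context, HellaSwag-style benchmarks are typically scored by having the model assign a log-likelihood to each candidate ending of a context and counting how often the highest-scoring ending matches the gold one. A toy sketch of that scoring, assuming the per-ending log-likelihoods are already computed:

```python
def pick_ending(scores):
    # Index of the highest-scoring candidate ending (argmax).
    return max(range(len(scores)), key=scores.__getitem__)

def multiple_choice_accuracy(examples):
    # examples: iterable of (per_ending_log_likelihoods, gold_index).
    examples = list(examples)
    correct = sum(pick_ending(s) == gold for s, gold in examples)
    return correct / len(examples)
```

The details of how CS-HellaSwag log-likelihoods are normalized (e.g. by ending length) are not shown here; the sketch only illustrates the argmax-accuracy idea.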