---
library_name: transformers
tags: [Danish, BPE Tokenization, CerebrasGPT]
---

### STD-BPE-CEREBRAS

A standard CerebrasGPT-111M model using a pretrained Byte-Pair Encoding (BPE) tokenizer. It serves as a baseline for evaluating how pretrained tokenizers perform on Danish text.
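To illustrate the kind of subword segmentation a BPE tokenizer produces, here is a minimal, self-contained sketch of the BPE merge procedure applied to a single Danish word. This is a simplified illustration only, not the actual CerebrasGPT tokenizer: real BPE learns its merge table from corpus-wide pair frequencies, whereas this toy version merges pairs within one word.

```python
from collections import Counter

def get_pair_counts(tokens):
    # Count adjacent symbol pairs in the token sequence.
    return Counter(zip(tokens, tokens[1:]))

def merge_pair(tokens, pair):
    # Replace every occurrence of `pair` with the merged symbol.
    merged, i = [], 0
    while i < len(tokens):
        if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == pair:
            merged.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

def bpe_segment(word, num_merges):
    # Start from individual characters and greedily merge the most
    # frequent adjacent pair, as BPE does during training.
    tokens = list(word)
    for _ in range(num_merges):
        pairs = get_pair_counts(tokens)
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        tokens = merge_pair(tokens, best)
    return tokens

# Danish compound word ("the central station"); characters like "å"
# are single symbols here, though byte-level BPE would split them
# into multiple bytes.
print(bpe_segment("hovedbanegården", 5))
```

A tokenizer pretrained mostly on English text tends to fragment Danish words, especially those containing æ, ø, and å, into many short subwords, which is exactly the behavior this baseline is meant to measure.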