DA-BPE-CEREBRAS

This CerebrasGPT-111M model uses a standard Byte-Pair Encoding (BPE) tokenizer for Danish text. It serves as a baseline for comparison with morphology-aware tokenizers.
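A minimal usage sketch with 🤗 Transformers, assuming the checkpoint is published under the repo id `meelu/DA-BPE-CEREBRAS` and loads through the standard causal-LM interface used by Cerebras-GPT checkpoints; the Danish prompt is purely illustrative:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed repo id; adjust if the checkpoint lives under a different name.
model_id = "meelu/DA-BPE-CEREBRAS"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize a Danish prompt with the BPE tokenizer and generate a continuation.
prompt = "København er hovedstaden i"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```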

Model size: 111M parameters · Tensor type: FP16 · Format: Safetensors