stabilityai/stablelm-2-zephyr-1_6b
Stability AI

Tags: Text Generation · Transformers · Safetensors · GGUF · 8 datasets · English · stablelm · causal-lm · conversational · Inference Endpoints
arXiv: 2305.18290
License: other
stablelm-2-zephyr-1_6b · 7 contributors · History: 41 commits
Latest commit e1520e0 (verified, 8 months ago) by jon-tow: fix(tokenizer): set `model_max_length=4096`
| File | Size | Last commit | Age |
| --- | --- | --- | --- |
| .gitattributes | 1.52 kB | initial commit | 11 months ago |
| LICENSE | 7.45 kB | Upload LICENSE | 11 months ago |
| README.md | 7.84 kB | update(tokenizer): convert to `GPT2Tokenizer` (#15) | 10 months ago |
| config.json | 608 Bytes | revert(config): use `float16` torch dtype | 10 months ago |
| configuration_stablelm.py | 9.07 kB | merge: upload transformers implementation (#14) | 10 months ago |
| generation_config.json | 121 Bytes | merge: upload transformers implementation (#14) | 10 months ago |
| merges.txt | 917 kB | update(tokenizer): convert to `GPT2Tokenizer` (#15) | 10 months ago |
| model.safetensors (LFS) | 3.29 GB | Upload StableLMEpochForCausalLM | 11 months ago |
| modeling_stablelm.py | 63.1 kB | merge: upload transformers implementation (#14) | 10 months ago |
| special_tokens_map.json | 784 Bytes | update(tokenizer): convert to `GPT2Tokenizer` (#15) | 10 months ago |
| stablelm-2-zephyr-1_6b-OpenVINO-4bit.bin (LFS) | 1.05 GB | OpenVINO NNCF 4BIT quantization | 11 months ago |
| stablelm-2-zephyr-1_6b-OpenVINO-4bit.xml (LFS) | 2.89 MB | OpenVINO NNCF 4BIT quantization | 11 months ago |
| stablelm-2-zephyr-1_6b-Q4_0.gguf (LFS) | 983 MB | GGUF Q4_0, Q4_1, Q8_0 quantized files | 11 months ago |
| stablelm-2-zephyr-1_6b-Q4_1.gguf (LFS) | 1.07 GB | GGUF Q4_0, Q4_1, Q8_0 quantized files | 11 months ago |
| stablelm-2-zephyr-1_6b-Q5_K_M.gguf (LFS) | 1.19 GB | GGUF Q5_K_M quantize | 11 months ago |
| stablelm-2-zephyr-1_6b-Q8_0.gguf (LFS) | 1.75 GB | GGUF Q4_0, Q4_1, Q8_0 quantized files | 11 months ago |
| stablelm-2-zephyr-1_6b.gguf (LFS) | 3.29 GB | FP16 GGUF file | 11 months ago |
| tokenizer.json | 4.24 MB | update(tokenizer): convert to `GPT2Tokenizer` (#15) | 10 months ago |
| tokenizer_config.json | 1.4 kB | fix(tokenizer): set `model_max_length=4096` | 8 months ago |
| vocab.json | 2.01 MB | update(tokenizer): convert to `GPT2Tokenizer` (#15) | 10 months ago |