ikawrakow/various-2bit-sota-gguf
GGUF
License:
apache-2.0
Branch: main
1 contributor
History: 13 commits
Latest commit: 6d5bc07 by ikawrakow, "Adding Nous-Hermes 2.31 bpw quantized models", 12 months ago
| File | Size | Last commit message | Age |
| --- | --- | --- | --- |
| .gitattributes | 1.56 kB | Adding first set of models | 12 months ago |
| README.md | 384 Bytes | Update README.md | 12 months ago |
| llama-v2-13b-2.17bpw.gguf | 3.54 GB | Adding first set of models | 12 months ago |
| llama-v2-13b-2.39bpw.gguf | 3.89 GB | Adding 2.31-bpw base quantized models | 12 months ago |
| llama-v2-70b-2.12bpw.gguf | 18.3 GB | Adding more | 12 months ago |
| llama-v2-70b-2.36bpw.gguf | 20.3 GB | Adding 2.31-bpw base quantized models | 12 months ago |
| llama-v2-7b-2.20bpw.gguf | 1.85 GB | Adding first set of models | 12 months ago |
| llama-v2-7b-2.42bpw.gguf | 2.03 GB | Adding 2.31-bpw base quantized models | 12 months ago |
| mistral-7b-2.20bpw.gguf | 1.99 GB | Adding first set of models | 12 months ago |
| mistral-7b-2.43bpw.gguf | 2.2 GB | Adding 2.31-bpw base quantized models | 12 months ago |
| mistral-instruct-7b-2.43bpw.gguf | 2.2 GB | Adding Mistral instruct models | 12 months ago |
| mixtral-8x7b-2.10bpw.gguf | 12.3 GB | Adding Mixtral-8x7b | 12 months ago |
| mixtral-8x7b-2.34bpw.gguf | 13.7 GB | Adding 2.31-bpw base quantized models | 12 months ago |
| mixtral-instruct-8x7b-2.10bpw.gguf | 12.3 GB | Adding Mixtral-instruct-8x7b | 12 months ago |
| mixtral-instruct-8x7b-2.34bpw.gguf | 13.7 GB | Adding Mistral instruct models | 12 months ago |
| nous-hermes-2-10.7b-2.18bpw.gguf | 2.92 GB | Adding Nous-Hermes-2-SOLAR-10.7B 2-bit quants | 12 months ago |
| nous-hermes-2-10.7b-2.41bpw.gguf | 3.23 GB | Adding Nous-Hermes 2.31 bpw quantized models | 12 months ago |
| nous-hermes-2-10.7b-2.70bpw.gguf | 3.62 GB | Adding Nous-Hermes-2-SOLAR-10.7B 2-bit quants | 12 months ago |
| nous-hermes-2-34b-2.16bpw.gguf | 9.31 GB | Adding Nous-Hermes-2-Yi-34B 2-bit quants | 12 months ago |
| nous-hermes-2-34b-2.40bpw.gguf | 10.3 GB | Adding Nous-Hermes 2.31 bpw quantized models | 12 months ago |
| nous-hermes-2-34b-2.69bpw.gguf | 11.6 GB | Adding Nous-Hermes-2-Yi-34B 2-bit quants | 12 months ago |
| rocket-3b-2.31bpw.gguf | 808 MB | Adding Rocket-3b 2-bit quants | 12 months ago |
| rocket-3b-2.76bpw.gguf | 967 MB | Adding Rocket-3b 2-bit quants | 12 months ago |

All .gguf files above are stored via Git LFS.
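The bits-per-weight (bpw) figure in each filename can be roughly sanity-checked against the listed file size. A minimal sketch, assuming the listed sizes use decimal GB and taking an approximate parameter count of 6.74B for Llama-2 7B (both assumptions on my part, not stated in the listing; GGUF metadata and tensor-type overhead mean the result is only approximate):

```python
def approx_bpw(size_gb: float, n_params_b: float) -> float:
    """Estimate bits per weight from a file size (decimal GB) and a
    parameter count (billions). Ignores GGUF metadata overhead."""
    return size_gb * 1e9 * 8 / (n_params_b * 1e9)

# llama-v2-7b-2.20bpw.gguf is listed at 1.85 GB;
# Llama-2 7B has roughly 6.74B parameters (approximate, not from the listing).
print(round(approx_bpw(1.85, 6.74), 2))  # → 2.2
```

The estimate lands close to the 2.20 bpw in the filename; larger deviations (e.g. for Mixtral files) are expected, since some tensors are kept at higher precision and the parameter counts used here are approximations.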