Dracones / perky-70b-v0.1-GGUF
Tags: GGUF · English · Not-For-All-Audiences · arXiv:2203.05482 · License: llama2
1 contributor (Dracones) · History: 7 commits
Latest commit: f166973 (verified), about 1 year ago — "Upload perky-70b-v0.1-Q8_0.gguf-part-b with huggingface_hub"
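The upload commits were made with the huggingface_hub library, and the same library can fetch individual files from this repo. A minimal sketch follows; the repo id and file-name pattern come from the listing below, while the helper names (`quant_filename`, `download_quant`) are illustrative, not part of any official API.

```python
# Sketch: fetching one quantization from this repo with `huggingface_hub`.
# Assumes `pip install huggingface_hub`; the import is done lazily because
# the model files are 25-49 GB and should only be downloaded deliberately.

REPO_ID = "Dracones/perky-70b-v0.1-GGUF"

def quant_filename(quant: str) -> str:
    """Build a file name for a quantization level shown in the listing,
    e.g. 'Q4_K_M' -> 'perky-70b-v0.1-Q4_K_M.gguf'."""
    return f"perky-70b-v0.1-{quant}.gguf"

def download_quant(quant: str) -> str:
    """Download one quant file to the local HF cache and return its path."""
    from huggingface_hub import hf_hub_download  # lazy: large downloads only
    return hf_hub_download(repo_id=REPO_ID, filename=quant_filename(quant))

# Example (downloads ~41 GB, so it is left commented out):
#   path = download_quant("Q4_K_M")
```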
File                              Size      Last commit message
.gitattributes                    2.59 kB   Upload perky-70b-v0.1-Q8_0.gguf-part-b with huggingface_hub
Perky.card.png                    1.72 MB   Upload folder using huggingface_hub
README.md                         6.68 kB   Upload README.md with huggingface_hub
perky-70b-v0.1-IQ3_XXS.gguf       28.2 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q2_K.gguf          25.5 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q3_K_L.gguf        36.1 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q3_K_M.gguf        33.3 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q3_K_S.gguf        29.9 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q3_K_XS.gguf       28.3 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q4_0.gguf          38.9 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q4_K_M.gguf        41.4 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q4_K_S.gguf        39.2 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q5_0.gguf          47.5 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q5_K_M.gguf        48.8 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q5_K_S.gguf        47.5 GB   Upload folder using huggingface_hub
perky-70b-v0.1-Q6_K.gguf-part-a   32.2 GB   Upload perky-70b-v0.1-Q6_K.gguf-part-a with huggingface_hub
perky-70b-v0.1-Q6_K.gguf-part-b   24.4 GB   Upload perky-70b-v0.1-Q6_K.gguf-part-b with huggingface_hub
perky-70b-v0.1-Q8_0.gguf-part-a   42.9 GB   Upload perky-70b-v0.1-Q8_0.gguf-part-a with huggingface_hub
perky-70b-v0.1-Q8_0.gguf-part-b   30.3 GB   Upload perky-70b-v0.1-Q8_0.gguf-part-b with huggingface_hub

All commits date to about 1 year ago. The model files and Perky.card.png are stored via Git LFS.
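The Q6_K and Q8_0 quants exceed Hugging Face's per-file size limit and are uploaded in two pieces. Judging from the `-part-a`/`-part-b` naming these appear to be plain byte-splits (an assumption, not stated in the repo), in which case rejoining is simple concatenation in alphabetical order:

```shell
# Rejoin a split quant (assumes the parts are plain byte-splits):
#   cat perky-70b-v0.1-Q8_0.gguf-part-a perky-70b-v0.1-Q8_0.gguf-part-b \
#       > perky-70b-v0.1-Q8_0.gguf
# Small demonstration of the same operation with dummy 4-byte parts:
printf 'AAAA' > demo.gguf-part-a
printf 'BBBB' > demo.gguf-part-b
cat demo.gguf-part-a demo.gguf-part-b > demo.gguf
wc -c < demo.gguf          # prints 8: parts joined in order, byte for byte
rm demo.gguf demo.gguf-part-a demo.gguf-part-b
```

If the parts were instead produced by llama.cpp's own split tooling, they would need its merge tool rather than `cat`; a plain `cat` result that llama.cpp refuses to load is the sign to check for that.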