# Mediocreatmybest/WizardCoder-Python-13B-V1.0_8bit_nf4
Tags: Text Generation · Transformers · PyTorch · Safetensors · llama · code · Eval Results · text-generation-inference · Inference Endpoints · 8-bit precision · bitsandbytes

arXiv: 4 papers
License: llama2
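
The tags above (Transformers, PyTorch, bitsandbytes, 8-bit precision) suggest the checkpoint is meant to be loaded through the Transformers library with bitsandbytes quantization. A minimal sketch, assuming a CUDA GPU with `bitsandbytes` and `accelerate` installed; the repo ID comes from this page, while the prompt and generation settings are illustrative:

```python
# Sketch: load the checkpoint in 8-bit via bitsandbytes and generate code.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

repo_id = "Mediocreatmybest/WizardCoder-Python-13B-V1.0_8bit_nf4"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    device_map="auto",  # requires `accelerate`; places layers on available GPUs
    # If the saved config already embeds quantization settings (the repo is
    # tagged 8-bit/bitsandbytes), this explicit config may be redundant.
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```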
Branch: main · 1 contributor · History: 5 commits

Latest commit: Create README.md (`ca6f57b`) by Mediocreatmybest, over 1 year ago
## Files

| File | Size | LFS | Last commit | Updated |
|------|------|-----|-------------|---------|
| .gitattributes | 1.52 kB | | initial commit | over 1 year ago |
| README.md | 9.22 kB | | Create README.md | over 1 year ago |
| added_tokens.json | 21 Bytes | | Upload tokenizer | over 1 year ago |
| config.json | 1.05 kB | | Upload LlamaForCausalLM | over 1 year ago |
| generation_config.json | 116 Bytes | | Upload LlamaForCausalLM | over 1 year ago |
| model-00001-of-00002.safetensors | 9.96 GB | LFS | Adding `safetensors` variant of this model (#1) | over 1 year ago |
| model-00002-of-00002.safetensors | 3.4 GB | LFS | Adding `safetensors` variant of this model (#1) | over 1 year ago |
| model.safetensors.index.json | 54.4 kB | | Adding `safetensors` variant of this model (#1) | over 1 year ago |
| pytorch_model-00001-of-00002.bin | 9.96 GB | LFS | Upload LlamaForCausalLM | over 1 year ago |
| pytorch_model-00002-of-00002.bin | 3.4 GB | LFS | Upload LlamaForCausalLM | over 1 year ago |
| pytorch_model.bin.index.json | 51.8 kB | | Upload LlamaForCausalLM | over 1 year ago |
| special_tokens_map.json | 96 Bytes | | Upload tokenizer | over 1 year ago |
| tokenizer.json | 1.84 MB | | Upload tokenizer | over 1 year ago |
| tokenizer.model | 500 kB | LFS | Upload tokenizer | over 1 year ago |
| tokenizer_config.json | 758 Bytes | | Upload tokenizer | over 1 year ago |

All files are flagged Safe by the repository scanner. The two `pytorch_model-*.bin` shards are pickle-based; the scanner detected five pickle imports in each (`torch.FloatStorage`, `torch.CharStorage`, `torch.HalfStorage`, `torch._utils._rebuild_tensor_v2`, `collections.OrderedDict`), all benign.
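
Because the repository carries both `safetensors` shards and the older pickle-based `.bin` shards, a loader can be pinned to the safetensors variant. A minimal sketch, assuming the same environment as above; `use_safetensors=True` makes `from_pretrained` error out rather than fall back to the pickle files:

```python
# Sketch: load only the safetensors shards, never the pickle-based .bin files.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Mediocreatmybest/WizardCoder-Python-13B-V1.0_8bit_nf4",
    use_safetensors=True,  # refuse to fall back to pytorch_model-*.bin
    device_map="auto",     # requires `accelerate`
)
```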