Safetensors
Model size: 1.07B params
Tensor types: F32, FP16, U8
Inference Providers
This model is not currently available via any of the supported Inference Providers.
The model cannot be deployed to the HF Inference API because the API does not support models that require custom code execution.
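
Because the repository ships custom modeling code, the model has to be loaded locally with `trust_remote_code=True`. A minimal sketch, assuming `transformers`, `torch`, and `bitsandbytes` are installed (the prompt and generation settings are illustrative, not part of this card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "PrunaAI/internlm-internlm2-chat-1_8b-sft-bnb-4bit-smashed"

# trust_remote_code=True is required because the repo includes custom
# InternLM2 modeling code, which the HF Inference API will not execute.
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    trust_remote_code=True,
    device_map="auto",  # place the 4-bit weights on the available GPU/CPU
)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```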

Model tree for PrunaAI/internlm-internlm2-chat-1_8b-sft-bnb-4bit-smashed
Quantized: one of 4 quantized models derived from the base model.
