Converted from microsoft/phi-1_5 and quantized to 4 bits.

Requires `onnxruntime>=0.17.0`.
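
Below is a minimal usage sketch assuming the model is consumed through transformers.js with the `text-generation` pipeline; the package name, the `dtype: "q4"` option, and the prompt are assumptions for illustration, not part of this repository.

```ts
// Minimal sketch: run the 4-bit quantized ONNX weights via transformers.js.
// Assumes transformers.js v3 ("@huggingface/transformers") and that the
// quantized weight files in this repo match the "q4" dtype naming.
import { pipeline } from "@huggingface/transformers";

// Create a text-generation pipeline backed by onnxruntime.
const generator = await pipeline(
  "text-generation",
  "BricksDisplay/phi-1_5-q4",
  { dtype: "q4" } // assumption: selects the 4-bit quantized weights
);

// Generate a short completion from a sample prompt (hypothetical).
const output = await generator("def fibonacci(n):", { max_new_tokens: 64 });
console.log(output);
```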
