---
license: apache-2.0
inference: false
---
# dolphin-2.9.3-mistral-7b-32k-ov
dolphin-2.9.3-mistral-7b-32k-ov is an OpenVino int4 quantized version of Dolphin 2.9.3 Mistral 7B, providing a very fast, very small inference implementation, optimized for AI PCs using Intel GPU, CPU and NPU.
cognitivecomputations/dolphin-2.9.3-mistral-7b-32k is a leading chat finetune of Mistral 7B, designed for general-purpose dialog.
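For reference, an int4 OpenVino export of a parent model like this can be produced with optimum-intel roughly as sketched below. This is a generic illustration under assumed defaults (repo id, quantization parameters), not the exact recipe used to build this model.

```python
# Hedged sketch: one common way to create an int4 OpenVino export with optimum-intel
# (pip install optimum[openvino]); not necessarily the recipe used for this repo.
from optimum.intel import OVModelForCausalLM, OVWeightQuantizationConfig

# 4-bit weight-only quantization config (default group size and ratio assumed)
quant_config = OVWeightQuantizationConfig(bits=4)

model = OVModelForCausalLM.from_pretrained(
    "cognitivecomputations/dolphin-2.9.3-mistral-7b-32k",  # parent model (assumed repo id)
    export=True,
    quantization_config=quant_config,
)
model.save_pretrained("dolphin-2.9.3-mistral-7b-32k-ov")
```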
## Get started right away with OpenVino
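The snippet below is a minimal sketch of running the model with OpenVINO GenAI, assuming the `openvino-genai` and `huggingface_hub` packages are installed; the repo id with the `llmware/` prefix and the prompt are illustrative assumptions, and the device string should be set to match the hardware on the AI PC.

```python
# Minimal sketch (not an official recipe): run the int4 OpenVino model with OpenVINO GenAI.
# Assumes: pip install openvino-genai huggingface_hub
from huggingface_hub import snapshot_download
import openvino_genai as ov_genai

# Pull the quantized model files into the local Hugging Face cache (repo id assumed)
model_path = snapshot_download("llmware/dolphin-2.9.3-mistral-7b-32k-ov")

# Use "CPU", "GPU", or "NPU" depending on the Intel hardware available
pipe = ov_genai.LLMPipeline(model_path, "CPU")

print(pipe.generate("What are the key benefits of int4 quantization?", max_new_tokens=256))
```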
Looking for AI PC solutions and demos? Contact us at llmware.
## Model Description
- Developed by: cognitivecomputations
- Model type: mistral
- Parameters: 7 billion
- Model Parent: cognitivecomputations/dolphin-2.9.3-mistral-7b-32k
- Language(s) (NLP): English
- License: Apache 2.0
- Uses: General purpose dialog and chat
- RAG Benchmark Accuracy Score: NA
- Quantization: int4