---
license: mit
datasets:
- pankajmathur/orca_mini_v1_dataset
- pankajmathur/orca_mini_v8_sharegpt_format
language:
- en
base_model:
- pankajmathur/orca_mini_phi-4
library_name: transformers
---
# Model Name: orca_mini_phi-4-GGUF
**orca_mini_phi-4-GGUF is a static quantized version of [orca_mini_phi-4](https://huggingface.co/pankajmathur/orca_mini_phi-4).**
"Obsessed with Open Source GenAI's potential? So am I ! Let's Contribute together 🚀 https://www.linkedin.com/in/pankajam"
### NOTICE
By providing proper credit and attribution, you are granted permission to use this model as a foundational base for further full fine-tuning, DPO, PPO, or ORPO tuning, and any kind of merges.
I actively encourage users to customize and enhance the model according to their specific needs, as this version is designed to be a comprehensive general-purpose model.
Dive in and innovate!
### Example Usage on Your Personal Computer
My preferred method is Ollama, but you can choose other ways to run this model locally on your PC by going to the right side of this repo card and selecting "Use this model".
Here are the steps for Ollama:
1. Install Ollama: https://github.com/ollama/ollama
2. After installing Ollama, choose a quant size based on your PC's RAM and run the command below in your terminal (see the alternative-quant note after these steps). For example, on a Nov 2023 MacBook Pro with 16GB RAM:
```shell
ollama run hf.co/pankajmathur/orca_mini_phi-4-GGUF:Q4_K_S
```
3. Start chatting with this model!
4. Optional: For a ChatGPT-like experience (or better), install Open WebUI: https://github.com/open-webui/open-webui
```shell
pip install open-webui
open-webui serve
```
5. Open WebUI will automatically connect to the Ollama instance running this model, and you can start chatting right away (see the note after these steps if it does not connect on its own).
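As mentioned in step 2, you can swap the quant tag for one that better matches your hardware; larger quants generally preserve more quality but need more RAM. The `Q8_0` tag below is an assumption used for illustration only, so check this repo's "Files and versions" tab for the quants actually published:
```shell
# Hypothetical example: run a larger quant on a machine with more RAM.
# The Q8_0 tag is an assumption -- verify the tags actually listed in this repo's files.
ollama run hf.co/pankajmathur/orca_mini_phi-4-GGUF:Q8_0
```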
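If Open WebUI does not detect Ollama on its own, here is a minimal sketch for pointing it at the local Ollama API, assuming Ollama is serving on its default port 11434:
```shell
# Assumes Ollama is already running locally on its default port (11434).
export OLLAMA_BASE_URL=http://127.0.0.1:11434
open-webui serve
# Then open http://localhost:8080 in your browser and select this model from the model list.
```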