Model Name: orca_mini_phi-4-GGUF
orca_mini_phi-4-GGUF is a static quantized version of orca_mini_phi-4.
"Obsessed with Open Source GenAI's potential? So am I ! Let's Contribute together 🚀 https://www.linkedin.com/in/pankajam"NOTICE
By providing proper credit and attribution, you are granted permission to use this model as a foundational base for further full fine-tuning, DPO, PPO, or ORPO tuning, and any kind of merges. I actively encourage users to customize and enhance the model according to their specific needs, as this version is designed to be a comprehensive general-purpose model. Dive in and innovate!
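If you plan to build on top of this model, note that fine-tuning typically starts from the original full-precision weights rather than this GGUF quant. The following is a minimal sketch, assuming the base repository pankajmathur/orca_mini_phi-4 and the Hugging Face transformers library; the training setup itself (dataset, trainer, hyperparameters) is left as a placeholder.

```python
# Minimal sketch: loading the base (non-quantized) model as a starting point for
# further fine-tuning or merging. Assumes the pankajmathur/orca_mini_phi-4 repository
# and the Hugging Face transformers library are available.
from transformers import AutoModelForCausalLM, AutoTokenizer

base_repo = "pankajmathur/orca_mini_phi-4"  # original weights, not this GGUF quant

tokenizer = AutoTokenizer.from_pretrained(base_repo)
model = AutoModelForCausalLM.from_pretrained(
    base_repo,
    torch_dtype="auto",   # pick the dtype stored in the checkpoint
    device_map="auto",    # spread across available GPUs/CPU
)

# From here you could attach your own Trainer, or a TRL DPO/PPO/ORPO trainer,
# on your dataset, or merge with other checkpoints, per the notice above.
```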
Example Usage on Your Personal Computer
My preferred method is Ollama, but you can choose other ways to run this model locally on your PC by going to the right side of this repo card and selecting "Use this model".
Here are the steps for Ollama:
- Install Ollama https://github.com/ollama/ollama
- After installing Ollama, choose a quant size based on your PC's RAM and run the command below in your terminal. For example, on a Nov 2023 MacBook Pro with 16GB RAM:
ollama run hf.co/pankajmathur/orca_mini_phi-4-GGUF:Q4_K_S
- Start Chatting with this model!
- Optional: For a ChatGPT-like experience (or better), install Open WebUI https://github.com/open-webui/open-webui
pip install open-webui
open-webui serve
- Open WebUI will auto-connect to Ollama running this model, and you can start chatting right away. For scripted access to the same local model, see the sketch below.
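Once the model is running under Ollama, you can also query it from a script through Ollama's local HTTP API. This is a minimal sketch assuming the default endpoint http://localhost:11434 and the Q4_K_S tag pulled above; adjust the model name to whichever quant you chose.

```python
# Minimal sketch: chatting with the locally served GGUF model through Ollama's
# HTTP API (default port 11434). The model tag matches the `ollama run` command above.
import requests

MODEL = "hf.co/pankajmathur/orca_mini_phi-4-GGUF:Q4_K_S"

response = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": MODEL,
        "messages": [
            {"role": "user", "content": "Explain static quantization in one paragraph."}
        ],
        "stream": False,  # return a single JSON object instead of a token stream
    },
    timeout=300,
)
response.raise_for_status()
print(response.json()["message"]["content"])
```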