---
license: apache-2.0
tags:
- moe
- merge
- mergekit
- lazymergekit
- phi3_mergekit
- microsoft/Phi-3-mini-4k-instruct
base_model:
- microsoft/Phi-3-mini-4k-instruct
- microsoft/Phi-3-mini-4k-instruct
---

# Phi3Mix

Phi3Mix is a Mixture of Experts (MoE) made with the following models using [Phi3_LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct)
* [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct)

## 🧩 Configuration

```yaml
base_model: microsoft/Phi-3-mini-4k-instruct
gate_mode: cheap_embed
experts_per_token: 1
dtype: float16
experts:
  - source_model: microsoft/Phi-3-mini-4k-instruct
    positive_prompts: ["research, logic, math, science"]
  - source_model: microsoft/Phi-3-mini-4k-instruct
    positive_prompts: ["creative, art"]
```
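
The [Phi3_LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing) notebook writes this config to disk and runs the merge for you. As a rough sketch of that step, assuming the YAML above is saved as `config.yaml` and that a standard `mergekit` installation exposes the `mergekit-moe` entry point (the Phi-3-specific workflow in the notebook may use different options):

```python
# Illustrative only: mirrors running `mergekit-moe config.yaml Phi3Mix` on the command line.
# Assumes mergekit is installed (pip install mergekit) and config.yaml holds the YAML above.
import subprocess

subprocess.run(["mergekit-moe", "config.yaml", "Phi3Mix"], check=True)
```
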
## 💻 Usage

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HassanStar/Phi3Mix"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the float16 merge dtype
    trust_remote_code=True,
)

# Phi-3 prompt format: system and user turns are each closed with <|end|>.
prompt = "How many continents are there?"
chat = f"<|system|>You are a helpful AI assistant.<|end|><|user|>{prompt}<|end|><|assistant|>"
input_ids = tokenizer.encode(chat, return_tensors="pt")

outputs = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(tokenizer.decode(outputs[0]))
```
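
Alternatively, the prompt can be built from the tokenizer's chat template instead of writing the special tokens by hand. This is a minimal sketch reusing the `tokenizer` and `model` loaded above, assuming the merged tokenizer inherits Phi-3's chat template:

```python
# Let the tokenizer's chat template insert <|user|>/<|assistant|>/<|end|> for us.
# (Prepend a {"role": "system", ...} turn only if the shipped template supports it.)
messages = [
    {"role": "user", "content": "How many continents are there?"},
]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

outputs = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```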