heyuan committed
Commit f097d44 · verified · 1 Parent(s): a0a270e

Update README.md

Files changed (1)
  1. README.md +7 -3
README.md CHANGED

@@ -18,9 +18,14 @@ Rose-2x7B is a Mixure of Experts (MoE) made with the following models using [Merg
 * [maywell/PiVoT-0.1-Starling-LM-RP](https://huggingface.co/maywell/PiVoT-0.1-Starling-LM-RP)
 * [WizardLM/WizardMath-7B-V1.1](https://huggingface.co/WizardLM/WizardMath-7B-V1.1)
 
+```bash
+mergekit-moe mergekit_moe.yaml merge --copy-tokenizer --device cuda --low-cpu-memory
+```
+
 ## 🧩 Configuration
 
-```yamlbase_model: uproai/ros-7b-v1
+```yaml
+base_model: uproai/ros-7b-v1
 experts:
   - source_model: maywell/PiVoT-0.1-Starling-LM-RP
     positive_prompts:
@@ -37,8 +42,7 @@ experts:
       - "solve"
      - "count"
 tokenizer_source: union
-
-#mergekit-moe mergekit_moe.yaml merge --copy-tokenizer --device cuda --low-cpu-memory```
+```
 
 ## 💻 Usage
 
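The hunks stop just before the README's "💻 Usage" section. For orientation, here is a minimal sketch of how the merged MoE checkpoint could be loaded with 🤗 Transformers; the repository id `uproai/Rose-2x7B`, the prompt, and the generation settings are illustrative assumptions and are not taken from this commit.

```python
# Minimal sketch (not from the commit): load the merged MoE checkpoint with
# Hugging Face Transformers. The repo id below is an assumption based on the
# model card title; replace it with the actual repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "uproai/Rose-2x7B"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to keep the 2x7B MoE within GPU memory
    device_map="auto",          # spread layers across available devices
)

prompt = "Count the prime numbers between 1 and 20, then solve 12 * 17."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The `mergekit-moe ... --copy-tokenizer` invocation shown in the diff, together with `tokenizer_source: union`, is what makes a single tokenizer available in the merged repository, so the plain `AutoTokenizer.from_pretrained` call above should suffice.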