Update README.md
README.md
CHANGED
@@ -30,18 +30,16 @@ Hermes-2-Pro-Mixtral-4x7B is a Mixture of Experts (MoE) made with the following m
 experts:
   - source_model: NousResearch/Hermes-2-Pro-Mistral-7B
     positive_prompts:
-      - ""
+      - " "
   - source_model: NousResearch/Hermes-2-Pro-Mistral-7B
     positive_prompts:
-      - ""
-
+      - " "
   - source_model: NousResearch/Hermes-2-Pro-Mistral-7B
     positive_prompts:
-      - ""
-
+      - " "
   - source_model: NousResearch/Hermes-2-Pro-Mistral-7B
     positive_prompts:
-      - ""
+      - " "
 ```
 
 ## 💻 Usage
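
The hunk above edits only the `experts` block of the mergekit-moe configuration quoted in the model card, replacing each empty positive prompt (`""`) with a single space (`" "`) and dropping two stray blank lines. For orientation, below is a minimal sketch of what a complete config of this shape looks like; the `base_model`, `gate_mode`, and `dtype` values are illustrative assumptions and do not appear in the diff.

```yaml
# Sketch of a full mergekit-moe config around the experts block shown above.
# base_model, gate_mode, and dtype are assumptions, not taken from the README hunk.
base_model: NousResearch/Hermes-2-Pro-Mistral-7B
gate_mode: random        # assumption; mergekit also supports hidden-state based gate initialization
dtype: bfloat16
experts:
  - source_model: NousResearch/Hermes-2-Pro-Mistral-7B
    positive_prompts:
      - " "
  - source_model: NousResearch/Hermes-2-Pro-Mistral-7B
    positive_prompts:
      - " "
  - source_model: NousResearch/Hermes-2-Pro-Mistral-7B
    positive_prompts:
      - " "
  - source_model: NousResearch/Hermes-2-Pro-Mistral-7B
    positive_prompts:
      - " "
```

A config file like this is typically passed to the `mergekit-moe` command together with an output directory, e.g. `mergekit-moe config.yaml ./merged-model`.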