This is a direct extraction of the 8 experts from [Mixtral-8x7b-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1).
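The mechanics aren't spelled out here, but as a rough illustration of what extracting experts means at the tensor level, the sketch below enumerates the per-expert MLP weights in a Mixtral checkpoint. Parameter names follow the `transformers` Mixtral implementation; the selection logic is illustrative, not the authors' exact procedure.

```python
from transformers import AutoModelForCausalLM

# Load the source model (large download; device_map/offloading may be needed).
model = AutoModelForCausalLM.from_pretrained("mistralai/Mixtral-8x7B-Instruct-v0.1")

# In the transformers Mixtral layout, every decoder layer carries a
# block_sparse_moe module with 8 expert MLPs (w1, w2, w3) plus a router gate.
expert_tensors = {
    name: param.detach()
    for name, param in model.named_parameters()
    if ".block_sparse_moe.experts." in name
}
print(len(expert_tensors))  # 32 layers * 8 experts * 3 projections = 768 tensors
```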
- **Expert Configuration:** Two experts are routed per token, the standard Mixtral setup (see the config sketch after this list).
- **Performance:** Performance is identical to the original instruct model, if not a little better.
- **Evaluations:** Evaluations will follow; the extracted model is more malleable to further training.
- **Experimentation:** This is the first of a few MoE expert extraction and modification projects we're working on, with more to come. Enjoy.
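To make the routing setup concrete, the two-experts-per-token choice is visible directly on the model config. A minimal sketch using the `transformers` defaults, which match the source model's architecture:

```python
from transformers import MixtralConfig

# Standard Mixtral MoE settings; AutoConfig.from_pretrained on the actual
# checkpoint would report the same routing fields.
config = MixtralConfig()
print(config.num_local_experts)    # 8  -> experts available per MoE layer
print(config.num_experts_per_tok)  # 2  -> experts the router selects per token
```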
## Instruction Format

To leverage instruction fine-tuning, prompts should be surrounded by `[INST]` and `[/INST]` tokens. The very first instruction should begin with a begin-of-sentence token id, while subsequent instructions should not. Assistant generation will conclude with an end-of-sentence token id.
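As a concrete illustration, the template can be applied with the `transformers` chat-template machinery. This sketch assumes the tokenizer ships Mixtral's standard chat template and uses the upstream repo id as a stand-in for wherever these weights are hosted:

```python
from transformers import AutoTokenizer

# Stand-in repo id; substitute the actual location of the extracted weights.
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B-Instruct-v0.1")

messages = [
    {"role": "user", "content": "What is your favourite condiment?"},
    {"role": "assistant", "content": "My favourite condiment is mayonnaise."},
    {"role": "user", "content": "Do you have mayonnaise recipes?"},
]

# apply_chat_template wraps each user turn in [INST] ... [/INST], prepends a
# single begin-of-sentence id, and closes each assistant turn with an
# end-of-sentence id, matching the description above.
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt")
```

Decoding `input_ids` back to text is an easy way to verify the markers land exactly where the description says they should.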