Update README.md
README.md CHANGED
@@ -14,9 +14,7 @@ license: apache-2.0
 Just one day after the release of **Mixtral-8x-22b**, we are excited to introduce our handcrafted experimental model, **Mistral-22b-V.01**. This model is the result of distilling knowledge equally from all experts into a single, dense 22B model. It is not a single trained expert; rather, it is a compressed MoE model turned into a dense 22B model. This is the first working MoE-to-dense model conversion.
 
 ### Capabilities
-- **Math Proficiency**: The model exhibits strong mathematical abilities.
-- **Reasoning**: It possesses decent reasoning skills.
-- **User Interaction**: It responds effectively to user prompts.
+- **Math Proficiency**: The model exhibits strong mathematical abilities, despite not being trained on math.
 
 ### Experimental Nature
 Please note that Mistral-22b-V.01 is an experimental model. It has been fine-tuned on fewer examples than the model planned for release tomorrow. We encourage you to explore its capabilities and provide feedback.