Update README.md
README.md CHANGED

@@ -20,10 +20,10 @@ model-index:
 ---
 
 # cogbuji/OpenHermes-2.5-Mistral-7B-mlx-4bit
 
-This model was converted to MLX format from [teknium/OpenHermes-2.5-Mistral-7B](/teknium/OpenHermes-2.5-Mistral-7B) and quantized
+This model was converted to MLX format from [teknium/OpenHermes-2.5-Mistral-7B](/teknium/OpenHermes-2.5-Mistral-7B) and quantized.
 Refer to the [original model card](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) for more details on the model.
 
-It was converted and quantized with mlx 0.7.0 and mlx_lm 0.3 and should be used with those versions. Later versions of these may deprecate this model
+It was converted and quantized with mlx **0.7.0** and mlx_lm **0.3.0** and should be used with those versions. Later versions of these libraries may deprecate this model.
 ## Use with mlx
 
 ```bash
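The hunk ends just as the `## Use with mlx` bash block opens, so the actual install/usage commands are not shown here. As a minimal sketch of the version pinning the card calls out, assuming the libraries are published on PyPI under the names `mlx` and `mlx-lm` (the diff itself does not state the package names):

```bash
# Sketch: pin the library versions named in the model card
# (assumes the PyPI package names mlx and mlx-lm)
pip install "mlx==0.7.0" "mlx-lm==0.3.0"
```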