darkproger committed
Commit 96f9287 · Parent: f73e687
Update README.md
README.md CHANGED
@@ -34,7 +34,8 @@ model-index:
 ---
 
 # lang-uk/dragoman-4bit
-This model was converted to MLX format from [`lang-uk/dragoman`]()
+This model was converted to MLX format from the [`lang-uk/dragoman`](https://huggingface.co/lang-uk/dragoman) adapter fused into the [`mistralai/Mistral-7b-v0.1`](https://huggingface.co/mistralai/Mistral-7B-v0.1)
+base model and quantized into 4 bits using mlx-lm version **0.4.0**.
 Refer to the [original model card](https://huggingface.co/lang-uk/dragoman) for more details on the model.
 ## Use with mlx
 
@@ -48,3 +49,9 @@ from mlx_lm import load, generate
 model, tokenizer = load("lang-uk/dragoman-4bit")
 response = generate(model, tokenizer, prompt="hello", verbose=True)
 ```
+
+Or use from your shell:
+
+```console
+python -m mlx_lm.generate --model lang-uk/dragoman-4bit --prompt '[INST] who holds this neighborhood? [/INST]' --temp 0 --max-tokens 100
+```
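
The updated card text describes the conversion only in prose: the `lang-uk/dragoman` adapter is fused into the `mistralai/Mistral-7B-v0.1` base model and the result is quantized to 4 bits with mlx-lm 0.4.0. Below is a minimal sketch of how such a conversion might be reproduced with the mlx-lm command-line tools; the adapter file name, output paths, and exact flag spellings (`--adapter-file`, `--q-bits`) are assumptions and may differ between mlx-lm releases.

```console
# Assumed reproduction of the conversion described above; flag names and
# paths are illustrative and may differ between mlx-lm versions.

# 1. Fuse the dragoman adapter into the Mistral-7B base model
#    (the adapter file name "adapters.npz" is an assumption).
python -m mlx_lm.fuse \
    --model mistralai/Mistral-7B-v0.1 \
    --adapter-file adapters.npz \
    --save-path dragoman-fused

# 2. Quantize the fused model to 4 bits and write it out in MLX format.
python -m mlx_lm.convert \
    --hf-path dragoman-fused \
    --mlx-path dragoman-4bit \
    -q --q-bits 4
```

The resulting local `dragoman-4bit` directory can then be loaded just as in the README's example, e.g. `load("dragoman-4bit")` instead of `load("lang-uk/dragoman-4bit")`.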
|