---
license: apache-2.0
tags:
- mlx
---
# mlx-community/mathstral-7B-v0.1-fp16
The model [mlx-community/mathstral-7B-v0.1-fp16](https://huggingface.co/mlx-community/mathstral-7B-v0.1-fp16) was converted to MLX format from [mistralai/mathstral-7B-v0.1](https://huggingface.co/mistralai/mathstral-7B-v0.1) using mlx-lm version **0.15.2**.
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Download (if needed) and load the model and tokenizer from the Hub.
model, tokenizer = load("mlx-community/mathstral-7B-v0.1-fp16")

# Generate a completion; verbose=True streams tokens as they are produced.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
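Since Mathstral is an instruction-tuned Mistral model, prompts generally work best when wrapped in the `[INST] … [/INST]` chat format. A minimal sketch of building such a prompt by hand is below; the helper name `build_prompt` and the sample question are illustrative, and in practice `tokenizer.apply_chat_template` from the loaded tokenizer produces the canonical format (including special tokens) for you.

```python
def build_prompt(question: str) -> str:
    # Illustrative only: wrap a user question in Mistral's instruction
    # delimiters. The tokenizer's apply_chat_template method is the
    # authoritative way to do this and also handles special tokens.
    return f"[INST] {question} [/INST]"

prompt = build_prompt("Solve 2x + 3 = 7.")
```

The resulting string can then be passed as the `prompt` argument to `generate`.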