---
license: apache-2.0
tags:
  - mlx
---

# mlx-community/mathstral-7B-v0.1-fp16

The model [mlx-community/mathstral-7B-v0.1-fp16](https://huggingface.co/mlx-community/mathstral-7B-v0.1-fp16) was converted to MLX format from [mistralai/mathstral-7B-v0.1](https://huggingface.co/mistralai/mathstral-7B-v0.1) using mlx-lm version **0.15.2**.
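
A similar conversion can be reproduced with mlx-lm's `convert` helper. This is a minimal sketch assuming the documented `convert()` signature in mlx-lm 0.15.x; the local output directory name is an example, not taken from this card:

```python
# Sketch: reproduce an fp16 conversion with mlx-lm's convert helper.
# Argument names follow mlx-lm's documented convert() API; check your
# installed version, as keyword names may differ slightly between releases.
from mlx_lm import convert

convert(
    "mistralai/mathstral-7B-v0.1",       # source Hugging Face repository
    mlx_path="mathstral-7B-v0.1-fp16",   # hypothetical local output directory
    # quantize defaults to False, keeping the original fp16 weights
)
```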

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Download the fp16 MLX weights and tokenizer from the Hub (or load from the local cache).
model, tokenizer = load("mlx-community/mathstral-7B-v0.1-fp16")

# Generate a completion for a simple prompt; verbose=True prints the output and timing stats.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
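
For instruction-style prompts, the tokenizer's chat template can optionally be applied before calling `generate`. The sketch below uses the standard Hugging Face `apply_chat_template` API rather than anything specified by this card; whether this checkpoint's tokenizer ships a chat template is an assumption worth verifying:

```python
# Sketch: format an instruction-style prompt with the tokenizer's chat template
# before generating. apply_chat_template is the standard Hugging Face tokenizer
# API; the example question is illustrative.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/mathstral-7B-v0.1-fp16")

messages = [{"role": "user", "content": "Solve 12 * 17 step by step."}]
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
response = generate(model, tokenizer, prompt=prompt, verbose=True)
```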