---
library_name: transformers
license: apache-2.0
tags:
  - mlx
---

# mlx-community/Yi-Coder-9B-Chat-4bit

The model mlx-community/Yi-Coder-9B-Chat-4bit was converted to MLX format from 01-ai/Yi-Coder-9B-Chat using mlx-lm version 0.18.1.
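For reference, a conversion of this kind can be reproduced with mlx-lm's own conversion utility. The snippet below is a minimal sketch assuming the `convert` function exposed by `mlx_lm` (with `hf_path`, `mlx_path`, `quantize`, and `q_bits` parameters) as available around version 0.18.1; check the documentation of your installed version for the exact signature.

```python
# Minimal sketch of the conversion step (assumes the mlx_lm convert API
# available around mlx-lm 0.18.1; verify the signature for your version).
from mlx_lm import convert

# Download 01-ai/Yi-Coder-9B-Chat, quantize the weights to 4 bits,
# and write the MLX-format model to ./Yi-Coder-9B-Chat-4bit.
convert(
    hf_path="01-ai/Yi-Coder-9B-Chat",
    mlx_path="Yi-Coder-9B-Chat-4bit",
    quantize=True,
    q_bits=4,
)
```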

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Load the 4-bit quantized model and its tokenizer from the Hugging Face Hub.
model, tokenizer = load("mlx-community/Yi-Coder-9B-Chat-4bit")

# Generate a completion for a simple prompt; verbose=True streams the output.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
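
Since this is a chat-tuned model, responses are usually better when the prompt is wrapped in the model's chat template before generation. The sketch below assumes the tokenizer returned by `load` exposes the standard Hugging Face `apply_chat_template` method, as it does in recent mlx-lm releases; the example message is illustrative.

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Yi-Coder-9B-Chat-4bit")

# Wrap the user message in the model's chat template so the chat-tuned
# weights see the prompt format they were trained on.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```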