# decisionslab/Dlab-852-Mini-Preview-4-bit
The model decisionslab/Dlab-852-Mini-Preview-4-bit was converted to MLX format from microsoft/Phi-4 using mlx-lm version 0.21.5.
## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("decisionslab/Dlab-852-Mini-Preview-4-bit")

prompt = "hello"

if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
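For quick testing without writing Python, the mlx-lm package also installs a command-line generator. The following is a minimal sketch assuming mlx-lm is already installed as above; the prompt text and `--max-tokens` value are illustrative choices, not part of the original card.

```bash
# Generate text from the converted model directly from the shell
python -m mlx_lm.generate \
  --model decisionslab/Dlab-852-Mini-Preview-4-bit \
  --prompt "hello" \
  --max-tokens 256
```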
## License
All content in this repository is proprietary and confidential. The software and any associated documentation are the exclusive property of Decisions Lab. Unauthorized copying, distribution, modification, or use via any medium is strictly prohibited. Use of this software requires explicit permission from Decisions Lab.
© 2025 Decisions Lab. All rights reserved.