---
license: apache-2.0
datasets:
- CycloneDX/cdx-docs
language:
- en
base_model:
- unsloth/phi-4
library_name: mlx
tags:
- cyclonedx
- cdxgen
- sbom
- security
- purl
- obom
- ml-bom
- cbom
---

**PREVIEW RELEASE**

## Testing with LM Studio

Use [LM Studio](https://lmstudio.ai/docs/basics/download-model) to download and test this model. Search for `CycloneDX/cdx1-mlx-6bit`.

Use the following configuration, setting the system prompt via LM Studio's prompt [template](https://lmstudio.ai/docs/advanced/prompt-template) settings:

```
System Prompt: You are a helpful assistant to the user.
Temperature: 0.05
Max tokens: 8192 or 16000
Context length: 16000
```

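LM Studio can also expose the loaded model through its OpenAI-compatible local server. The snippet below is a minimal sketch, assuming the server is running on the default port `1234` and that the model identifier matches the one LM Studio reports; it uses the `openai` Python package, which is not part of this repository.

```python
# Minimal sketch: query the model via LM Studio's OpenAI-compatible local server.
# Assumes the server is enabled on the default port (1234).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="CycloneDX/cdx1-mlx-6bit",  # check the exact id in LM Studio's model list
    messages=[
        {"role": "system", "content": "You are a helpful assistant to the user."},
        {"role": "user", "content": "tell me about cdxgen"},
    ],
    temperature=0.05,
    max_tokens=8192,
)
print(response.choices[0].message.content)
```
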
## Testing with mlx

Install [miniconda](https://docs.anaconda.com/miniconda/install/) or a standalone Python 3.11, then create an environment and install the dependencies:

```shell
conda create --name cdx1-mlx python=3.11
conda activate cdx1-mlx
# mlx-lm provides the mlx_lm.generate CLI used below
pip install mlx mlx-lm
```

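Optionally, confirm that MLX is installed and usable before running inference. This is a minimal sanity check, not part of the published instructions:

```python
# Minimal sanity check that MLX is installed and can run a simple operation.
import mlx.core as mx

a = mx.array([1.0, 2.0, 3.0])
print(mx.sum(a))            # expected: array(6, dtype=float32)
print(mx.default_device())  # Device(gpu, 0) on Apple silicon
```
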
LLM inference from the CLI:

```shell
mlx_lm.generate --model CycloneDX/cdx1-mlx-6bit --system-prompt "You are a helpful assistant to the user." --prompt "tell me about cdxgen" --temp 0.05
```

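The model can also be called from Python through the `mlx_lm` API. The following is a minimal sketch assuming a recent `mlx-lm` release; exact keyword arguments (for example, how temperature is set) vary between versions, and the prompt shown is only illustrative.

```python
# Minimal sketch: load the model and run a chat-style generation with mlx-lm.
# Assumes `pip install mlx-lm`; keyword arguments may differ across releases.
from mlx_lm import load, generate

model, tokenizer = load("CycloneDX/cdx1-mlx-6bit")

messages = [
    {"role": "system", "content": "You are a helpful assistant to the user."},
    {"role": "user", "content": "tell me about cdxgen"},
]
# Build the prompt using the model's chat template.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

text = generate(model, tokenizer, prompt=prompt, max_tokens=8192)
print(text)
```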