---
license: apache-2.0
datasets:
- CycloneDX/cdx-docs
language:
- en
base_model:
- unsloth/phi-4
library_name: mlx
tags:
- cyclonedx
- cdxgen
- sbom
- security
- purl
- obom
- ml-bom
- cbom
---
**PREVIEW RELEASE**
## Testing with LM Studio
Use [LM Studio](https://lmstudio.ai/docs/basics/download-model) to download and test this model. Search for `CycloneDX/cdx1-mlx` (Full version) or `CycloneDX/cdx1-mlx-8bit` (Recommended).
Use the following configuration:

- System Prompt: `You are a helpful assistant to the user.` See the LM Studio [prompt template](https://lmstudio.ai/docs/advanced/prompt-template) documentation to customize this.
- Temperature: 0.05
- Max tokens: 8192 or 16000
- Context length: 16000
## Testing with mlx
Install [miniconda](https://docs.anaconda.com/miniconda/install/) or Python 3.11, then create an environment and install the dependencies:
```
conda create --name cdx1-mlx python=3.11
conda activate cdx1-mlx
pip install mlx mlx-lm
```
Run LLM inference from the CLI:
```shell
mlx_lm.generate --model CycloneDX/cdx1-mlx --system-prompt "You are a helpful assistant to the user." --prompt "tell me about cdxgen" --temp 0.05
```
Use the 8-bit version for faster inference and lower memory usage.
```shell
mlx_lm.generate --model CycloneDX/cdx1-mlx-8bit --system-prompt "You are a helpful assistant to the user." --prompt "tell me about cdxgen" --temp 0.05
```
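The model can also be used from Python via the mlx-lm API. Below is a minimal sketch assuming a recent `mlx-lm` release; the exact `generate` keyword arguments (for example, how temperature is passed) vary between versions, so check the installed version's documentation.

```python
from mlx_lm import load, generate

# Load the 8-bit model and tokenizer from the Hugging Face Hub.
model, tokenizer = load("CycloneDX/cdx1-mlx-8bit")

# Build the prompt using the same system prompt as above.
messages = [
    {"role": "system", "content": "You are a helpful assistant to the user."},
    {"role": "user", "content": "tell me about cdxgen"},
]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

# Generate a response. Temperature/sampling options are version-dependent
# in mlx-lm, so they are omitted here.
response = generate(model, tokenizer, prompt=prompt, max_tokens=8192, verbose=True)
print(response)
```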