- obom
- ml-bom
- cbom
---

**PREVIEW RELEASE**

## Testing with LM Studio

Use [LM Studio](https://lmstudio.ai/docs/basics/download-model) to download and test this model. Search for `CycloneDX/cdx1-mlx` (full version) or `CycloneDX/cdx1-mlx-8bit` (recommended).

Use the following configuration:

- System prompt: `You are a helpful assistant to the user.` Use the prompt [template](https://lmstudio.ai/docs/advanced/prompt-template).
- Temperature: 0.05
- Max tokens: 8192 or 16000
- Context length: 16000
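With these settings, the model can also be exercised from a script. The snippet below is a minimal sketch, assuming you have enabled LM Studio's local OpenAI-compatible server (by default it listens on `http://localhost:1234/v1`) with one of the `cdx1-mlx` models loaded; the model identifier and port may differ in your setup.

```python
# Minimal sketch: query the LM Studio local server (OpenAI-compatible API).
# Assumes the server is running on the default port 1234 with cdx1-mlx-8bit loaded;
# adjust the "model" value to match the identifier shown in LM Studio.
import requests

response = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "CycloneDX/cdx1-mlx-8bit",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant to the user."},
            {"role": "user", "content": "tell me about cdxgen"},
        ],
        "temperature": 0.05,
        "max_tokens": 8192,
    },
    timeout=600,
)
print(response.json()["choices"][0]["message"]["content"])
```

The response should match what the LM Studio chat UI returns with the same settings.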
## Testing with mlx

Install [miniconda](https://docs.anaconda.com/miniconda/install/) or Python 3.11, then create an environment and install the dependencies:

```shell
conda create --name cdx1-mlx python=3.11
conda activate cdx1-mlx
pip install mlx-lm
```
Run LLM inference from the CLI:

```shell
mlx_lm.generate --model CycloneDX/cdx1-mlx --system-prompt "You are a helpful assistant to the user." --prompt "tell me about cdxgen" --temp 0.05
```
Use the 8-bit version for better speed and lower memory usage.

```shell
mlx_lm.generate --model CycloneDX/cdx1-mlx-8bit --system-prompt "You are a helpful assistant to the user." --prompt "tell me about cdxgen" --temp 0.05
```
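The models can also be driven from Python instead of the CLI. The sketch below uses the `mlx_lm` Python API (`load` and `generate`); it assumes a recent `mlx-lm` release, and since the keyword arguments for sampling options such as temperature have changed between releases, check the `generate` signature of the version you installed before adding them.

```python
# Minimal sketch of programmatic inference with the mlx-lm Python API.
# Assumes `pip install mlx-lm`; sampling-option argument names vary between releases.
from mlx_lm import load, generate

# Download (if needed) and load the model and tokenizer.
model, tokenizer = load("CycloneDX/cdx1-mlx-8bit")

# Build a chat-style prompt with the recommended system prompt.
messages = [
    {"role": "system", "content": "You are a helpful assistant to the user."},
    {"role": "user", "content": "tell me about cdxgen"},
]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Generate and print the completion.
text = generate(model, tokenizer, prompt=prompt, max_tokens=8192, verbose=True)
print(text)
```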