---
license: apache-2.0
widget:
- example_title: Yi-34B-Chat
  text: hi
  output:
    text: ' Hello! How can I assist you today?'
- example_title: Yi-34B
  text: There's a place where time stands still. A place of breath taking wonder,
    but also
  output:
    text: ' an eerie sense that something is just not right…

      Between the two worlds lies The Forgotten Kingdom - home to creatures long since
      thought extinct and ancient magic so strong it defies belief! Only here can
      you find what has been lost for centuries: An Elixir Of Life which will restore
      youth and vitality if only those who seek its power are brave enough to face
      up against all manner of dangers lurking in this mysterious land! But beware;
      some say there may even exist powerful entities beyond our comprehension whose
      intentions towards humanity remain unclear at best ---- they might want nothing
      more than destruction itself rather then anything else from their quest after
      immortality (and maybe someone should tell them about modern medicine)? In any
      event though – one thing remains true regardless : whether or not success comes
      easy depends entirely upon how much effort we put into conquering whatever challenges
      lie ahead along with having faith deep down inside ourselves too ;) So let’s
      get started now shall We?'
pipeline_tag: text-generation
base_model: 01-ai/Yi-6B-Chat
tags:
- mlx
---

# mlx-community/Yi-6B-Chat-4bit

The model [mlx-community/Yi-6B-Chat-4bit](https://huggingface.co/mlx-community/Yi-6B-Chat-4bit) was converted to MLX format from [01-ai/Yi-6B-Chat](https://huggingface.co/01-ai/Yi-6B-Chat) using mlx-lm version **0.19.2**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Load the 4-bit quantized model and its tokenizer from the Hugging Face Hub.
model, tokenizer = load("mlx-community/Yi-6B-Chat-4bit")

prompt = "hello"

# Wrap the prompt in the model's chat template when one is available,
# so the model sees the conversation format it was trained on.
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
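
Recent mlx-lm releases also ship a small command-line generator, which can be used instead of the Python API for quick tests (a sketch; the exact flags may vary between mlx-lm versions):

```bash
# Generate a completion from the terminal without writing any Python.
mlx_lm.generate --model mlx-community/Yi-6B-Chat-4bit --prompt "hello"
```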