mzbac committed
Commit 2f818c7 · 1 Parent(s): 7ac55fb

Update README.md

Files changed (1)
  1. README.md +5 -30
README.md CHANGED
@@ -2,42 +2,17 @@
  license: other
  tags:
  - mlx
- license_name: yi-license
+ license_name: deepseek
  license_link: LICENSE
- widget:
- - example_title: Yi-34B-Chat
-   text: hi
-   output:
-     text: ' Hello! How can I assist you today?'
- - example_title: Yi-34B
-   text: There's a place where time stands still. A place of breath taking wonder,
-     but also
-   output:
-     text: ' an eerie sense that something is just not right…
-
-     Between the two worlds lies The Forgotten Kingdom - home to creatures long since
-     thought extinct and ancient magic so strong it defies belief! Only here can
-     you find what has been lost for centuries: An Elixir Of Life which will restore
-     youth and vitality if only those who seek its power are brave enough to face
-     up against all manner of dangers lurking in this mysterious land! But beware;
-     some say there may even exist powerful entities beyond our comprehension whose
-     intentions towards humanity remain unclear at best ---- they might want nothing
-     more than destruction itself rather then anything else from their quest after
-     immortality (and maybe someone should tell them about modern medicine)? In any
-     event though – one thing remains true regardless : whether or not success comes
-     easy depends entirely upon how much effort we put into conquering whatever challenges
-     lie ahead along with having faith deep down inside ourselves too ;) So let’s
-     get started now shall We?'
- pipeline_tag: text-generation
  ---
 
- # Yi-34B-Chat-hf-4bit-mlx
- This model was converted to MLX format from [`01-ai/Yi-34B-Chat`]().
- Refer to the [original model card](https://huggingface.co/01-ai/Yi-34B-Chat) for more details on the model.
+ # deepseek-coder-6.7b-instruct-hf-4bit-mlx
+ This model was converted to MLX format from [`deepseek-ai/deepseek-coder-6.7b-instruct`]().
+ Refer to the [original model card](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct) for more details on the model.
  ## Use with mlx
  ```bash
  pip install mlx
  git clone https://github.com/ml-explore/mlx-examples.git
  cd mlx-examples/llms/hf_llm
- python generate.py --model mlx-community/Yi-34B-Chat-hf-4bit-mlx --prompt "My name is"
+ python generate.py --model mlx-community/deepseek-coder-6.7b-instruct-hf-4bit-mlx --prompt "### Instruction: \nwrite a quick sort algorithm in python.\n### Response: \n"
  ```
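
The `--prompt` string in the updated command contains literal `\n` escape sequences, which bash double quotes pass through unexpanded. Below is a minimal sketch of the same invocation (same generate.py script and flags as in the diff) that sends real newlines instead, using bash's `$'...'` quoting; whether generate.py expands `\n` on its own is an assumption worth verifying.

```bash
# Sketch only: same CLI as in the README diff above, but the instruction prompt
# is built with $'...' so the \n sequences become actual newline characters
# before they reach generate.py.
PROMPT=$'### Instruction: \nwrite a quick sort algorithm in python.\n### Response: \n'
python generate.py \
  --model mlx-community/deepseek-coder-6.7b-instruct-hf-4bit-mlx \
  --prompt "$PROMPT"
```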