Owhslp committed
Commit d70c39c · verified · 1 Parent(s): 2f11eaf

Update README.md

Files changed (1):
  1. README.md +1 -63
README.md CHANGED
@@ -1,63 +1 @@
- ---
- tags:
- - merge
- - mergekit
- - lazymergekit
- - lgodwangl/new_10m
- - coffiee/m28
- base_model:
- - lgodwangl/new_10m
- - coffiee/m28
- ---
-
- # mr_7
-
- mr_7 is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
- * [lgodwangl/new_10m](https://huggingface.co/lgodwangl/new_10m)
- * [coffiee/m28](https://huggingface.co/coffiee/m28)
-
- ## 🧩 Configuration
-
- ```yaml
- slices:
-   - sources:
-       - model: lgodwangl/new_10m
-         layer_range: [0, 32]
-       - model: coffiee/m28
-         layer_range: [0, 32]
- merge_method: slerp
- base_model: lgodwangl/new_10m
- parameters:
-   t:
-     - filter: self_attn
-       value: [0, 0.5, 0.3, 0.7, 1]
-     - filter: mlp
-       value: [1, 0.5, 0.7, 0.3, 0]
-     - value: 0.5
- dtype: bfloat16
- ```
-
- ## 💻 Usage
-
- ```python
- !pip install -qU transformers accelerate
-
- from transformers import AutoTokenizer
- import transformers
- import torch
-
- model = "Owhslp/mr_7"
- messages = [{"role": "user", "content": "What is a large language model?"}]
-
- tokenizer = AutoTokenizer.from_pretrained(model)
- prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
- pipeline = transformers.pipeline(
-     "text-generation",
-     model=model,
-     torch_dtype=torch.float16,
-     device_map="auto",
- )
-
- outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
- print(outputs[0]["generated_text"])
- ```
+ a