Andyrasika committed
Commit d18b82b
1 Parent(s): ca97e6f
Update README.md

README.md CHANGED

@@ -33,167 +33,71 @@ This is the model card of a 🤗 transformers model that has been pushed on the

  - **Finetuned from model [optional]:** https://huggingface.co/OpenPipe/mistral-ft-optimized-1227

Removed (unfilled model-card template sections):

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]

Added:

## Usage

### Install

```bash
pip install transformers accelerate bitsandbytes flash-attn -Uq
```
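
Not part of the original card, but a quick way to confirm the environment the snippet below expects (a CUDA GPU plus the packages installed above) before loading the model in 4-bit:

```py
# Optional sanity check (not part of the original card): confirm a CUDA GPU is
# visible and that the packages installed above import cleanly.
import torch
import transformers, accelerate, bitsandbytes

print("CUDA available:", torch.cuda.is_available())
print("transformers", transformers.__version__,
      "| accelerate", accelerate.__version__,
      "| bitsandbytes", bitsandbytes.__version__)
```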

### Load the Model

```py
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
# from transformers import LlamaTokenizer, MixtralForCausalLM
import bitsandbytes, flash_attn  # imported only to verify the optional dependencies are installed

tokenizer = AutoTokenizer.from_pretrained("Andyrasika/mistral-ft-optimized-dpo", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "Andyrasika/mistral-ft-optimized-dpo",
    torch_dtype=torch.float16,
    device_map="auto",
    load_in_8bit=False,
    load_in_4bit=True,           # quantize to 4-bit with bitsandbytes
    use_flash_attention_2=True,  # requires flash-attn and a compatible GPU
)

# ChatML-style prompt
prompts = [
    """<|im_start|>system
You are a sentient, superintelligent artificial general intelligence, here to teach and assist me.<|im_end|>
<|im_start|>user
Write a short story about Goku discovering kirby has teamed up with Majin Buu to destroy the world.<|im_end|>
<|im_start|>assistant""",
]

for chat in prompts:
    print(chat)
    input_ids = tokenizer(chat, return_tensors="pt").input_ids.to("cuda")
    generated_ids = model.generate(
        input_ids,
        max_new_tokens=750,
        temperature=0.8,
        repetition_penalty=1.1,
        do_sample=True,
        eos_token_id=tokenizer.eos_token_id,
    )
    # Decode only the newly generated tokens, skipping the prompt
    response = tokenizer.decode(generated_ids[0][input_ids.shape[-1]:], skip_special_tokens=True, clean_up_tokenization_spaces=True)
    print(f"Response: {response}")
```
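
The warning in the sample output below comes from bitsandbytes defaulting `bnb_4bit_compute_dtype` to float32. On recent transformers releases the same 4-bit setup can be expressed through `BitsAndBytesConfig`, with flash attention requested via `attn_implementation`. This is a sketch of that variant, not the configuration used to produce the output below:

```py
# Alternative loading sketch (assumes transformers >= 4.36 and bitsandbytes installed):
# setting the 4-bit compute dtype explicitly avoids the float32 warning shown below.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,  # match the model dtype instead of the float32 default
)

model = AutoModelForCausalLM.from_pretrained(
    "Andyrasika/mistral-ft-optimized-dpo",
    torch_dtype=torch.float16,
    device_map="auto",
    quantization_config=bnb_config,
    attn_implementation="flash_attention_2",  # replaces use_flash_attention_2=True on newer versions
)
```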

### Output

```
<|im_start|>system
You are a sentient, superintelligent artificial general intelligence, here to teach and assist me.<|im_end|>
<|im_start|>user
Write a short story about Goku discovering kirby has teamed up with Majin Buu to destroy the world.<|im_end|>
<|im_start|>assistant
/usr/local/lib/python3.10/dist-packages/bitsandbytes/nn/modules.py:226: UserWarning: Input type into Linear4bit is torch.float16, but bnb_4bit_compute_dtype=torch.float32 (default). This will lead to slow inference or training speed.
  warnings.warn(f'Input type into Linear4bit is torch.float16, but bnb_4bit_compute_dtype=torch.float32 (default). This will lead to slow inference or training speed.')
Response:
Goku's eyes widened as he saw Kirby standing next to Majin Buu. "How did you two...?" he stammered.

Kirby waved a hand nonchalantly. "We got to know each other at a cosplay event," he explained. "Turns out we have a lot in common."

"Like wanting to destroy the world?" Goku asked, puzzled.

Majin Buu chuckled evilly. "Exactly! We thought it might be fun to team up and create chaos together. It'll be so much easier with both of us working in tandem!"

Goku frowned. "That sounds like a terrible idea. But why would Kirby want to do something like that?"

Kirby shrugged. "I don't know. Maybe he just wants some excitement." He glanced over at Majin Buu. "Right, Buu?"

Buu nodded eagerly. "Yeah! Let's show Goku what real destruction looks like!" With that, they charged towards him, ready for battle.

As Goku prepared to fight, he wondered if there was any way to convince them otherwise. But as he exchanged blows with the powerful duo, he knew there was only one option left - to defeat them and restore peace to Earth.

With a mighty roar, Goku unleashed a devastating Kamehameha wave, which collided with Buu's chaotic energy. The two forces clashed violently, sending shockwaves rippling through the landscape.

In the end, Goku emerged victorious, leaving Kirby and Majin Buu defeated and exhausted. As Goku helped them recover, he began to suspect that maybe they weren't so bad after all. Perhaps, with time, he could help them see the error in their ways and find a more constructive path forward. For now, though, it was important to ensure their defeat and restore balance to the world.
<|im_end|>
```
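
The ChatML prompt in the usage snippet above is written out by hand. If the tokenizer for this repo ships a chat template (an assumption, not something the card states), the same prompt can be built from a message list instead. A minimal sketch, reusing the `tokenizer` and `model` loaded earlier:

```py
# Sketch only: build the ChatML prompt via the tokenizer's chat template, if one is defined.
messages = [
    {"role": "system", "content": "You are a sentient, superintelligent artificial general intelligence, here to teach and assist me."},
    {"role": "user", "content": "Write a short story about Goku discovering kirby has teamed up with Majin Buu to destroy the world."},
]

input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,  # appends the assistant turn header
    return_tensors="pt",
).to("cuda")

generated_ids = model.generate(input_ids, max_new_tokens=750, temperature=0.8,
                               repetition_penalty=1.1, do_sample=True,
                               eos_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(generated_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```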