agemagician committed cf9fb77 (parent: ea0ed7f): update readme with more examples

README.md CHANGED
The model is mostly meant to be fine-tuned on a supervised dataset.
### How to use

The following shows how one can extract the last hidden representation for the model.

```python
from transformers import T5Tokenizer, LongT5Model

tokenizer = T5Tokenizer.from_pretrained("agemagician/mlong-t5-tglobal-base")
model = LongT5Model.from_pretrained("agemagician/mlong-t5-tglobal-base")

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)

last_hidden_states = outputs.last_hidden_state
```
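`last_hidden_state` has shape `(batch, sequence_length, hidden_size)`. If a single vector per input is needed, one common (but not card-prescribed) reduction is masked mean pooling over the sequence dimension; the sketch below uses plain Python lists standing in for the tensors so it runs without loading the model.

```python
def masked_mean_pool(hidden, mask):
    """Average token vectors per sequence, ignoring padding positions (mask == 0).

    hidden: list of sequences, each a list of token vectors (lists of floats)
    mask:   list of sequences of 0/1 ints, same outer shape as `hidden`
    """
    pooled = []
    for vectors, keep in zip(hidden, mask):
        kept = [v for v, m in zip(vectors, keep) if m]
        n = max(len(kept), 1)  # guard against an all-padding sequence
        pooled.append([sum(dim) / n for dim in zip(*kept)])
    return pooled

# Dummy batch: 1 sequence, 3 token vectors of size 4; the last token is padding.
hidden = [[[0.0, 1.0, 2.0, 3.0], [4.0, 5.0, 6.0, 7.0], [8.0, 9.0, 10.0, 11.0]]]
mask = [[1, 1, 0]]
pooled = masked_mean_pool(hidden, mask)
print(pooled)  # [[2.0, 3.0, 4.0, 5.0]]
```

With real outputs, `hidden` would be `outputs.last_hidden_state.tolist()` and `mask` the tokenizer's `attention_mask`.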
The following shows how one can predict masked passages using the different denoising strategies.

### S-Denoising

For *S-Denoising*, please make sure to prompt the text with the prefix `[S2S]` as shown below.

```python
from transformers import LongT5ForConditionalGeneration, T5Tokenizer
import torch

model = LongT5ForConditionalGeneration.from_pretrained("agemagician/mlong-t5-tglobal-base", low_cpu_mem_usage=True, torch_dtype=torch.bfloat16).to("cuda")
tokenizer = T5Tokenizer.from_pretrained("agemagician/mlong-t5-tglobal-base")

input_string = "[S2S] Mr. Dursley was the director of a firm called Grunnings, which made drills. He was a big, solid man with a bald head. Mrs. Dursley was thin and blonde and more than the usual amount of neck, which came in very useful as she spent so much of her time craning over garden fences, spying on the neighbours. The Dursleys had a small son called Dudley and in their opinion there was no finer boy anywhere <extra_id_0>"

inputs = tokenizer(input_string, return_tensors="pt").input_ids.to("cuda")

outputs = model.generate(inputs, max_length=200)

print(tokenizer.decode(outputs[0]))
```
### R-Denoising

For *R-Denoising*, please make sure to prompt the text with the prefix `[NLU]` as shown below.

```python
from transformers import LongT5ForConditionalGeneration, T5Tokenizer
import torch

model = LongT5ForConditionalGeneration.from_pretrained("agemagician/mlong-t5-tglobal-base", low_cpu_mem_usage=True, torch_dtype=torch.bfloat16).to("cuda")
tokenizer = T5Tokenizer.from_pretrained("agemagician/mlong-t5-tglobal-base")

input_string = "[NLU] Mr. Dursley was the director of a firm called <extra_id_0>, which made <extra_id_1>. He was a big, solid man with a bald head. Mrs. Dursley was thin and <extra_id_2> of neck, which came in very useful as she spent so much of her time <extra_id_3>. The Dursleys had a small son called Dudley and <extra_id_4>"

inputs = tokenizer(input_string, return_tensors="pt", add_special_tokens=False).input_ids.to("cuda")

outputs = model.generate(inputs, max_length=200)

print(tokenizer.decode(outputs[0]))
```
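The decoded R-Denoising output interleaves the standard T5 sentinel tokens (`<extra_id_0>`, `<extra_id_1>`, …) with the predicted spans. If the individual span predictions are wanted rather than the raw string, they can be split out with a small parser; the function below is an illustrative sketch, not part of the `transformers` API.

```python
import re

def split_sentinel_spans(decoded: str) -> dict:
    """Split a decoded T5-style generation on <extra_id_N> sentinels.

    Returns a mapping from sentinel index to the text predicted for that span.
    """
    # Capturing group keeps each sentinel's index in the split result:
    # parts = [before_first_sentinel, idx0, text0, idx1, text1, ...]
    parts = re.split(r"<extra_id_(\d+)>", decoded)
    spans = {}
    for i in range(1, len(parts) - 1, 2):
        spans[int(parts[i])] = parts[i + 1].strip()
    return spans

decoded = "<extra_id_0> Grunnings <extra_id_1> drills"
print(split_sentinel_spans(decoded))  # {0: 'Grunnings', 1: 'drills'}
```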
### X-Denoising

For *X-Denoising*, please make sure to prompt the text with the prefix `[NLG]` as shown below.

```python
from transformers import LongT5ForConditionalGeneration, T5Tokenizer
import torch

model = LongT5ForConditionalGeneration.from_pretrained("agemagician/mlong-t5-tglobal-base", low_cpu_mem_usage=True, torch_dtype=torch.bfloat16).to("cuda")
tokenizer = T5Tokenizer.from_pretrained("agemagician/mlong-t5-tglobal-base")

input_string = "[NLG] Mr. Dursley was the director of a firm called Grunnings, which made drills. He was a big, solid man with a bald head. Mrs. Dursley was thin and blonde and more than the usual amount of neck, which came in very useful as she spent so much of her time craning over garden fences, spying on the neighbours. The Dursleys had a small son called Dudley and in their opinion there was no finer boy anywhere. <extra_id_0>"

inputs = tokenizer(input_string, return_tensors="pt", add_special_tokens=False).input_ids.to("cuda")

outputs = model.generate(inputs, max_length=200)

print(tokenizer.decode(outputs[0]))
```
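The three denoising examples above differ only in the prompt prefix. The prefix choice can be factored into a small helper; the function and dictionary names below are illustrative conveniences, not part of the `transformers` API.

```python
# Map each denoising strategy to the prompt prefix used in the examples above.
DENOISING_PREFIXES = {
    "s": "[S2S]",  # S-Denoising
    "r": "[NLU]",  # R-Denoising
    "x": "[NLG]",  # X-Denoising
}

def build_denoising_prompt(strategy: str, text: str) -> str:
    """Prepend the strategy-specific prefix expected by the model."""
    try:
        prefix = DENOISING_PREFIXES[strategy.lower()]
    except KeyError:
        raise ValueError(f"unknown strategy {strategy!r}; expected one of {sorted(DENOISING_PREFIXES)}")
    return f"{prefix} {text}"

print(build_denoising_prompt("s", "The Dursleys had a small son <extra_id_0>"))
# [S2S] The Dursleys had a small son <extra_id_0>
```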
### BibTeX entry and citation info

```bibtex