Update README.md
README.md CHANGED
@@ -50,7 +50,7 @@ AttributeError: module 'torch.library' has no attribute 'register_fake'
 
 ### Quickstart
 We use a special RuQwen2ForCausalLM class to work with this model:
-```
+```python
 from transformers import Qwen2ForCausalLM, AutoConfig, AutoTokenizer
 import torch
 
@@ -92,7 +92,7 @@ class RuQwen2ForCausalLM(Qwen2ForCausalLM):
         super().save_pretrained(save_directory, *args, **kwargs)
 ```
 Here is a code snippet that uses apply_chat_template to show how to load the tokenizer and model and how to generate content.
-```
+```python
 def generate(messages):
     input_ids = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt").to(model.device)
     output = model.generate(input_ids,
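For reference, a minimal loading sketch of how the class from the first hunk is typically wired up. It assumes RuQwen2ForCausalLM is defined as in the README snippet; the repo id is a placeholder, not the actual model name.

```python
import torch
from transformers import AutoTokenizer

# Placeholder repo id / local path: substitute the real model name.
model_name = "path-or-repo-id-of-this-model"

# RuQwen2ForCausalLM is the custom class defined in the README snippet above;
# from_pretrained is the standard transformers PreTrainedModel classmethod it inherits.
model = RuQwen2ForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # assumption: pick the dtype/device that fits your setup
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```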
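And a hedged usage sketch for the generate helper from the second hunk: the messages follow the standard chat-template role/content format, and what generate returns depends on the lines truncated in the hunk above.

```python
# Chat history in the role/content format expected by apply_chat_template.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Tell me briefly what you can do."},
]

# generate() is the helper defined in the README snippet above. Its exact
# return value (token ids vs. decoded text) is not shown in the hunk, so
# this simply prints whatever it returns.
print(generate(messages))
```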