patrickvonplaten osanseviero committed on
Commit 83e9aa1
Parent: adb7468

Add minor reference to transformers (#7)


- Add minor reference to transformers (6d09a724d62369e03273260e91471f90885a3626)
- Update README.md (28ef5dc97e80c593a11786c42a287da995ff6c87)


Co-authored-by: Omar Sanseviero <[email protected]>

Files changed (1)
  1. README.md +16 -1
README.md CHANGED

@@ -13,7 +13,7 @@ Mistral-7B-v0.3 has the following changes compared to [Mistral-7B-v0.2](https://
 
 ## Installation
 
-It is recommended to use `mistralai/Mistral-7B-Instruct-v0.3` with [mistral-inference](https://github.com/mistralai/mistral-inference)
+It is recommended to use `mistralai/Mistral-7B-Instruct-v0.3` with [mistral-inference](https://github.com/mistralai/mistral-inference). For HF transformers code snippets, please keep scrolling.
 
 ```
 pip install mistral_inference
@@ -115,6 +115,21 @@ result = tokenizer.instruct_tokenizer.tokenizer.decode(out_tokens[0])
 print(result)
 ```
 
+## Generate with `transformers`
+
+If you want to use Hugging Face `transformers` to generate text, you can do something like this.
+
+```py
+from transformers import pipeline
+
+messages = [
+    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
+    {"role": "user", "content": "Who are you?"},
+]
+chatbot = pipeline("text-generation", model="mistralai/Mistral-7B-Instruct-v0.3")
+chatbot(messages)
+```
+
 ## Limitations
 
 The Mistral 7B Instruct model is a quick demonstration that the base model can be easily fine-tuned to achieve compelling performance.
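
As a sketch of the equivalent lower-level usage (not part of the diff above), the chat added in this commit could also be run with `AutoTokenizer` and `AutoModelForCausalLM` instead of `pipeline`. The model id and messages mirror the README snippet; the dtype, device placement, and generation length below are illustrative assumptions.

```py
# Sketch only: a lower-level equivalent of the pipeline("text-generation", ...) snippet
# added in this commit. The pipeline applies the chat template, generates, and decodes
# internally; this spells those steps out.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" and torch_dtype="auto" are assumptions for the sketch, not from the commit.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Same messages as the README snippet above.
messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

# Render the chat with the model's chat template and tokenize it.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# max_new_tokens=256 is an arbitrary choice for the sketch.
output_ids = model.generate(input_ids, max_new_tokens=256)

# Decode only the newly generated tokens (everything after the prompt).
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```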