Commit 5cfc077 (verified) · 1 parent: ed45ac3
Committed by RaushanTurganbay

Update pipeline example

Files changed (1): README.md (+28 -2)
README.md CHANGED
@@ -32,11 +32,37 @@ other versions on a task that interests you.

### How to use

- Here's the prompt template for this model:
+ Here's the prompt template for this model, but we recommend using chat templates to format the prompt with `processor.apply_chat_template()`.
+ That will apply the correct template for a given checkpoint for you.
+
```
"[INST] <image>\nWhat is shown in this image? [/INST]"
```
- You can load and use the model like following:
+
+ To run the model with the `pipeline`, see the example below:
+
+ ```python
+ from transformers import pipeline
+
+ pipe = pipeline("image-text-to-text", model="llava-hf/llava-v1.6-mistral-7b-hf")
+ messages = [
+     {
+         "role": "user",
+         "content": [
+             {"type": "image", "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg"},
+             {"type": "text", "text": "What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud"},
+         ],
+     },
+ ]
+
+ out = pipe(text=messages, max_new_tokens=20)
+ print(out)
+ >>> [{'input_text': [{'role': 'user', 'content': [{'type': 'image', 'url': 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg'}, {'type': 'text', 'text': 'What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud'}]}], 'generated_text': 'Lava'}]
+ ```
+
+ You can also load and use the model as follows:
+
```python
from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration
import torch
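The updated text points readers to `processor.apply_chat_template()` instead of hand-writing the `[INST] ... [/INST]` string. Below is a minimal sketch of that call for this checkpoint; the conversation structure and the `add_generation_prompt` flag follow the Transformers chat-template API, and the exact rendered string is assumed to match the template shown above.

```python
from transformers import LlavaNextProcessor

# Load the processor for the checkpoint referenced in the diff above.
processor = LlavaNextProcessor.from_pretrained("llava-hf/llava-v1.6-mistral-7b-hf")

# Describe the user turn as structured content; the actual image is passed to the
# processor separately when building model inputs.
conversation = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "What is shown in this image?"},
        ],
    },
]

# add_generation_prompt=True appends the assistant marker so generation starts after it.
prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)
print(prompt)
# Expected to resemble: "[INST] <image>\nWhat is shown in this image? [/INST]"
```

The resulting string can then be passed to the processor together with the image, in place of the hard-coded prompt in the direct-loading example.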