Thirawarit committed
Commit a7bf642 · Parent(s): fe0ee6d
Update README.md

README.md CHANGED
@@ -62,15 +62,24 @@ The model was fine-tuned on several datasets:
 
 - **Accuracy on Manual-VQA Tasks**: 30.34%
 
+## Required Libraries
+
+Before you start, ensure you have the following libraries installed:
+
+```
+pip install git+https://github.com/andimarafioti/transformers.git@idefics3
+```
+
 ## Usage
 To use the model with the Hugging Face `transformers` library:
 
 ```python
-from transformers import
+from transformers import AutoProcessor, Idefics3ForConditionalGeneration
+
+DEVICE = f"cuda" if torch.cuda.is_available() else 'cpu' if torch.cpu.is_available() else 'mps'
+display(DEVICE)
+if DEVICE == 'cuda': display(torch.cuda.device_count())
 
-# Load the tokenizer and model
-tokenizer = AutoTokenizer.from_pretrained("nectec/Pathumma-llm-vision-1.0.0")
-model = AutoModel.from_pretrained("nectec/Pathumma-llm-vision-1.0.0")
 N = 5
 
 processor = AutoProcessor.from_pretrained(
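A note on the device-selection line added in this commit: the `f"cuda"` f-string carries no placeholder (a plain string works), `torch` and `display` are used without being imported, and because `torch.cpu.is_available()` returns `True` on any machine in recent PyTorch, the `'mps'` branch can never be reached. A minimal sketch of the presumably intended fallback order (CUDA, then Apple MPS, then CPU) — the helper name `pick_device` is ours for illustration, not from the README:

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Return the preferred torch device string: CUDA first, then Apple MPS, then CPU."""
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"
    return "cpu"

# With PyTorch installed, this would be driven by the real availability checks:
#   import torch
#   DEVICE = pick_device(torch.cuda.is_available(), torch.backends.mps.is_available())
#   print(DEVICE)
```

Checking CUDA before MPS matters because the two are never available on the same machine in practice, while CPU is always a valid last resort.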