update
README.md CHANGED
@@ -20,7 +20,7 @@ tags:
20   [\[🗨️ Chat Demo\]](http://eagle-vlm.xyz/) [\[🤗 HF Demo\]](TODO)
21   ## Introduction
22
23 - We are thrilled to release our latest Eagle2 series Vision-Language Model. Open-source Vision-Language Models (VLMs) have made significant strides in narrowing the gap with proprietary models. However, critical details about data strategies and implementation are often missing, limiting reproducibility and innovation. In this project, we focus on VLM post-training from a data-centric perspective, sharing insights into building effective data strategies from scratch. By combining these strategies with robust training recipes and model design, we introduce
23 + We are thrilled to release our latest Eagle2 series Vision-Language Model. Open-source Vision-Language Models (VLMs) have made significant strides in narrowing the gap with proprietary models. However, critical details about data strategies and implementation are often missing, limiting reproducibility and innovation. In this project, we focus on VLM post-training from a data-centric perspective, sharing insights into building effective data strategies from scratch. By combining these strategies with robust training recipes and model design, we introduce Eagle2, a family of performant VLMs. Our work aims to empower the open-source community to develop competitive VLMs with transparent processes.
24
25
26
@@ -83,7 +83,7 @@ We provide a [demo inference script](./demo.py) to help you quickly start using
83   pip install transformers==4.37.2
84   pip install flash-attn
85   ```
86 - **Note**: Latest version of transformers
86 + **Note**: Latest version of transformers is not compatible with the model.
87
88   ### 1. Prepare the Model worker
89
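The second hunk pins `transformers==4.37.2` and adds a warning that the latest release is not compatible with the model. As a rough illustration of that setup step, the sketch below verifies the pinned version and loads a checkpoint the way repositories with custom modeling code are typically loaded; the `AutoModel`/`AutoTokenizer` entry points with `trust_remote_code=True` and the placeholder model path are assumptions, and the repository's `demo.py` remains the authoritative reference.

```python
# Environment check and load sketch. Assumptions: the checkpoint ships custom
# modeling code loadable via AutoModel with trust_remote_code=True; the model
# path below is a placeholder. See the repository's demo.py for actual usage.
import transformers
from transformers import AutoModel, AutoTokenizer

# The README pins transformers==4.37.2; newer releases are flagged as incompatible.
assert transformers.__version__ == "4.37.2", (
    f"Expected transformers 4.37.2, found {transformers.__version__}"
)

model = AutoModel.from_pretrained(
    "path/to/Eagle2-checkpoint",  # placeholder; substitute the actual model id
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(
    "path/to/Eagle2-checkpoint",  # placeholder
    trust_remote_code=True,
)
```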