Update README.md
README.md
| UnifiedReward (Ours) | Pair/Point | √ | √ | √ | √ |

**VLRewardBench** Comparison Results

| Models | General | Hallu. | Reason. | Overall Accuracy | Macro Accuracy |
|------------------------|----------|----------|---------|------------------|----------------|
| Gemini-1.5-Pro | 50.8 | 72.5 | 64.2 | 67.2 | 62.5 |
| GPT-4o | 49.1 | 67.6 | **70.5** | 65.8 | 62.4 |
| LLaVA-Critic | 47.4 | 38.5 | 53.8 | 46.9 | 46.6 |
| OV-7B | 32.2 | 20.1 | 57.1 | 29.6 | 36.5 |
| **UnifiedReward** | 60.6 | 78.4 | 60.5 | 66.1 | 66.5 |
| **UnifiedReward-v1.5** | **68.1** | **84.4** | 59.5 | **70.1** | **70.7** |

---

**GenAI-Bench(Image)** Comparison Results

| Method | tau | diff |
|------------------------|----------|----------|
| PickScore | 53.2 | 67.2 |
| HPSv2 | 51.6 | 68.4 |
| ImageReward | 47.8 | 65.0 |
| VisionReward | 46.8 | 66.4 |
| OV-7B | 39.7 | 53.2 |
| **UnifiedReward** | 54.8 | 70.9 |
| **UnifiedReward-v1.5** | **58.9** | **72.4** |

---

**GenAI-Bench(Video)** and **VideoGen-Reward** Comparison Results

| Method | GenAI-Bench tau | GenAI-Bench diff | VideoGen-Reward tau | VideoGen-Reward diff |
|------------------------|------|------|------|------|
| VideoScore | 46.2 | 70.6 | 42.1 | 49.9 |
| LiFT | 41.2 | 60.1 | 40.6 | 58.3 |
| VisionReward | 52.1 | 73.1 | 57.4 | 68.2 |
| VideoReward | 50.2 | 73.3 | 60.1 | 73.9 |
| OV-7B | 40.8 | 51.4 | 40.4 | 50.2 |
| **UnifiedReward** | 60.7 | 77.2 | 66.6 | 79.3 |
| **UnifiedReward-v1.5** | **61.7** | **78.5** | **67.0** | **80.5** |

### Quick Start
All pair-ranking and point-scoring inference code is provided in our [github](https://github.com/CodeGoat24/UnifiedReward).
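
For orientation, the sketch below shows what point-score inference can look like through the Hugging Face `transformers` LLaVA-OneVision interface. The checkpoint id `CodeGoat24/UnifiedReward-7b`, the rating prompt, and the image path are illustrative assumptions rather than the official script; use the repository code above for the exact prompts and pipelines behind the benchmark numbers.

```python
# Minimal point-score inference sketch (illustrative, not the official script).
# Assumes a LLaVA-OneVision-compatible checkpoint and a recent `transformers` release.
import torch
from PIL import Image
from transformers import AutoProcessor, LlavaOnevisionForConditionalGeneration

model_id = "CodeGoat24/UnifiedReward-7b"  # assumed checkpoint name; check the model card

processor = AutoProcessor.from_pretrained(model_id)
model = LlavaOnevisionForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# One generated image and the caption it should match (point-score setting).
image = Image.open("example.jpg")
question = (
    "You are given a text caption and a generated image. "
    "Rate how well the image matches the caption on a scale of 1 to 5, "
    "then briefly justify the rating.\n"
    "Caption: a red bicycle leaning against a brick wall"
)

conversation = [
    {"role": "user", "content": [{"type": "image"}, {"type": "text", "text": question}]},
]
prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)
inputs = processor(images=image, text=prompt, return_tensors="pt").to(
    model.device, torch.float16
)

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Decode only the newly generated tokens, i.e. the model's score and explanation.
print(processor.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Pair ranking follows the same pattern: pass the two candidate images in one conversation and ask the model which better matches the prompt.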