Update README.md
README.md
@@ -28,7 +28,36 @@ For the data details of these benchmarks, please refer to [VideoScore-Bench](htt

- VideoScore-v1.1 is a **regression version** model.

## Evaluation Results

We test VideoScore-v1.1 on VideoFeedback-test and use the Spearman correlation between the model's outputs and human ratings, averaged over all evaluation aspects, as the indicator.

The evaluation results are shown below:

| metric            | VideoFeedback-test |
|:-----------------:|:------------------:|
| VideoScore-v1.1   | **74.0**           |
| Gemini-1.5-Pro    | 22.1               |
| Gemini-1.5-Flash  | 20.8               |
| GPT-4o            | <u>23.1</u>        |
| CLIP-sim          | 8.9                |
| DINO-sim          | 7.5                |
| SSIM-sim          | 13.4               |
| CLIP-Score        | -7.2               |
| LLaVA-1.5-7B      | 8.5                |
| LLaVA-1.6-7B      | -3.1               |
| X-CLIP-Score      | -1.9               |
| PIQE              | -10.1              |
| BRISQUE           | -20.3              |
| Idefics2          | 6.5                |
| MSE-dyn           | -5.5               |
| SSIM-dyn          | -12.9              |

The best result in the VideoScore series is in bold, and the best among the baselines is underlined.
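The averaged-Spearman indicator described above can be sketched in plain Python. This is a minimal illustration under stated assumptions: the aspect names and all scores below are made up for the example, not data from the VideoScore evaluation.

```python
from statistics import mean

def rank(values):
    """1-based ranks for tie-free data (smallest value gets rank 1)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for position, index in enumerate(order):
        ranks[index] = position + 1
    return ranks

def spearman(xs, ys):
    """Spearman rho for tie-free data: 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rank(xs), rank(ys)))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical per-aspect scores: model outputs vs. human ratings for the
# same five videos (illustrative numbers only).
model_scores = {
    "visual_quality":   [3.2, 1.5, 2.8, 3.9, 2.1],
    "temporal_quality": [2.9, 1.8, 3.1, 3.5, 2.4],
}
human_scores = {
    "visual_quality":   [2.5, 1.2, 3.0, 3.8, 1.9],
    "temporal_quality": [3.1, 1.5, 2.7, 3.6, 2.0],
}

# Correlation per aspect, then averaged across aspects (the reported metric).
per_aspect = [spearman(model_scores[a], human_scores[a]) for a in model_scores]
avg_spearman = mean(per_aspect)
print(f"{100 * avg_spearman:.1f}")  # prints 90.0 (scaled to 0-100 as in the table)
```

In practice one would use `scipy.stats.spearmanr`, which also handles tied ratings via average ranks; the closed-form `d^2` formula above only holds when there are no ties.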

## Usage

### Installation

```